Mobile robots and safety: the experience of Robotnik in HR-RECYCLER project


Collaborative robots have come front and center on the international stage as they’ve become widespread in Industry 4.0. Today we have more powerful, more advanced and more productive robots, so safety has become a key element.

Safety is the key

For Robotnik, as an experienced robot manufacturer, and within the collaborative environment of the HR-Recycler project, this aspect is especially important since humans and robots will be working side by side. In previous posts, we talked about the importance of safety focusing on the predictive/anticipative facet, that is, signaling and how it is implemented in our AGVs. Besides this, safety also involves aspects like collision avoidance, slowing down or stopping the robots, and the comfort of humans when a robot is working around them. But… how does this really work? How is Robotnik ensuring the safety of its robots?

As explained before, Robotnik aims to ensure the safety of its robots through compliance with EN ISO 3691-4:2020, which covers Industrial trucks — Safety requirements and verification — Part 4: Driverless industrial trucks and their systems.

Beyond compliance with the standard and with the specific regulations of each country, this post aims to give the reader a brief description of what, why and how the motion-related set of requirements in the ISO standard will be met.

First of all, the robot must be properly designed to allow the system to reduce its velocity, stop its motion or modify its behaviour according to the environmental conditions. As included in the ISO standard mentioned above, the minimum hardware requirements for ensuring safe motion are:

  • Braking system: The robot must be equipped with a braking system that works when the robot is switched off and that can also stop the system when the actuators are out of control.
  • Speed control system: The robot must be equipped with a speed monitoring system that sends a stop signal in case of overspeed. Its limits must also be consistent with the stability of the platform.
  • Steering control system: The robot must control the steering angle of the actuators in order to monitor the stability of the robot.
  • Protective devices and complementary measures: To detect persons in the robot's routes in automatic mode, the robot must be equipped with sensors capable of detecting persons and correctly installed to do so. If these devices cannot cover the direction of movement, the maximum velocity must be less than 0.3 m/s.

Once this hardware is installed in the robot and properly integrated, the configuration and the control areas must be aligned with the robot's capabilities. In order to supervise all these components, the robots are equipped with a safe PLC module, an adjustable modular safety controller for safety applications and the kernel of the monitoring process. The safety level actually achieved is determined by the external wiring, how the wiring is implemented, the configuration, the selection of command triggers, and how they are arranged on the machine. In our robots, the wiring and interaction flow is as follows:


The safe PLC communicates with the electromechanical brakes in order to trigger the safety functions and stop the robot or reduce its velocity. The configuration depends on the design, function and working area of the robot. If personnel can access the shuttle path, certain safety conditions must be implemented in accordance with applicable standards, so the robot must be able to update its safety level in real time:

  • Laser areas: The laser scanners used can switch between two kinds of areas: the warning area, where the robot reduces its maximum velocity as a precaution; and the protective field, where the robot stops if something enters it. Both fields are configured to grow proportionally with the velocity. The minimum distance of the protective field, for the lowest maximum speed limit, is 0.3 m.

  • Speed limit: Velocity limits are included as one of the conditions in the ISO standard to ensure that the robot is able to stop before a collision, with distance to spare; the limits can also be lowered to improve the stability of the base or the load. The maximum velocity allowed must be less than 0.3 m/s if the personnel detection systems (i.e. the laser scanners) are muted, and this mode can only be used by specialized workers. If the robot is not working close to people and has a large space to work in, the maximum speed is 1.2 m/s with the personnel detection system active. All possible velocity cases are summarized in the following table:

  • Steering angle: The direction of motion is also one of the conditions, since stability can be affected or the robot may have areas not covered by the safety sensors.
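As an illustration, the speed-limit and protective-field rules above can be sketched in a few lines of Python. This is a simplified sketch, not Robotnik's actual safety software: the constants come from the values quoted in this post, and the linear field scaling is an assumption.

```python
# Hedged sketch of the ISO-derived motion rules described above.
# Constants taken from the post; linear field growth is an assumption.

MAX_SPEED_DETECTION_ACTIVE = 1.2  # m/s, scanners active, large free space
MAX_SPEED_DETECTION_MUTED = 0.3   # m/s, scanners muted (specialized workers only)
MIN_PROTECTIVE_FIELD = 0.3        # m, minimum field at the lowest speed limit


def speed_limit(detection_active: bool) -> float:
    """Maximum allowed velocity for the current personnel-detection state."""
    return MAX_SPEED_DETECTION_ACTIVE if detection_active else MAX_SPEED_DETECTION_MUTED


def protective_field(current_speed: float) -> float:
    """Protective field grows proportionally with speed, never below the minimum."""
    scaled = MIN_PROTECTIVE_FIELD * (current_speed / MAX_SPEED_DETECTION_MUTED)
    return max(MIN_PROTECTIVE_FIELD, scaled)
```

In a real robot these limits would be enforced by the safe PLC and certified hardware, not application code; the sketch only shows how the conditions combine.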

Specific configuration

Under the project, two types of robots were designed: the RB-Kairos+, an omnidirectional platform fitted with a manipulator arm, and the RB-Ares, a robotic pallet truck.

The RB-Kairos+ configuration includes safety brakes, two accessible emergency buttons, a traction monitoring system on each wheel and two 2D lasers located in two of its corners. The scanners give the robot a 360º view of its environment. They detect any obstacle located at a height of 170 mm from the ground, which is the recommended height for detecting personnel's legs. The RB-Kairos+ is configured, as standard, to activate the emergency stop upon detecting an obstacle at a distance of 1000 mm. This configuration can be modified according to the environment or the application to be carried out, as long as it does not compromise the safety of the users.

The robot has a predefined safety zone on the horizontal plane, determined by safety lasers detection. When an obstacle or a person is detected inside the safety zone, the RB-Kairos+ will stop until the area is free again. The size of the safety zone depends on the speed of the mobile base:

When the base is static or has a speed below 0.15 m/s, the safety zone is the one detailed in the scheme below:

This area increases with the mobile base speed until reaching its maximum dimensions at a speed of 1 m/s:

If the speed reaches 1.3 m/s, the platform stops for safety reasons.
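The speed-dependent zone scaling for the RB-Kairos+ can be sketched as follows. Only the speed thresholds (0.15 m/s, 1 m/s and 1.3 m/s) come from the text; the zone depths below are hypothetical values chosen for illustration.

```python
# Hypothetical sketch of the RB-Kairos+ speed-dependent safety zone.
# Only the speed thresholds come from the post; zone depths are invented.

STATIC_ZONE_M = 1.0  # hypothetical zone depth when static or below 0.15 m/s
MAX_ZONE_M = 2.5     # hypothetical zone depth at 1 m/s


def safety_zone_depth(speed: float):
    """Zone depth in metres, or None when the platform must stop (>= 1.3 m/s)."""
    if speed >= 1.3:
        return None                        # overspeed: platform stops
    if speed <= 0.15:
        return STATIC_ZONE_M               # static zone
    capped = min(speed, 1.0)               # zone saturates at 1 m/s
    fraction = (capped - 0.15) / (1.0 - 0.15)
    return STATIC_ZONE_M + fraction * (MAX_ZONE_M - STATIC_ZONE_M)
```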

In the case of the pallet truck, it is configured with safety brakes, the steering and traction monitoring system and a single 2D laser located on its front face, which gives it a 270º view. It detects any obstacle located at a height of 170 mm from the ground and is configured to activate the emergency stop upon detecting an obstacle at a distance of 250 mm.

The robot has a predefined safety zone on the horizontal plane, determined by safety laser detection. The size of the safety zone depends on the speed vector of the mobile base, that is, its magnitude, direction and sense of motion. Several small regions are defined, and only the ones lying in the direction and sense of the motion vector are active.

When the base is static or has a speed below 0.15 m/s, the safety zone is the one detailed in the scheme below:

This area increases with the mobile base speed until reaching the maximum dimensions at a speed of 1.2 m/s:


Safety is a crucial aspect that must be kept in mind when designing an AGV that will work in a human-robot collaboration space. In the design of its robotic solutions, Robotnik aims to achieve total synergy between the components in order to build and develop robots capable of working safely among humans while remaining an efficient solution.

The concept of risk in the proposal for an AI Regulation


The regulatory model selected for artificial intelligence in the EU allows for a categorization of risks, depending on their level. This blogpost discusses the concept of risk, as used in the proposal for an AI Act, introduced by a draft Regulation in mid-2021. Generally, a risk can be expressed as a combination of the likelihood of an event and the severity of its consequences. The AI proposal, in particular, presents a major novelty in this domain; it revolves around the risk of harm to i) health and ii) safety, or a risk of adverse impact on iii) fundamental rights. These three areas of impact (or ‘harm absorption’) do not necessarily converge with each other since they are founded in distinct scientific fields. Inspired by the theory of risk (regulation), together with modern risk assessment methods and tools in an algorithmic environment, the emphasis is put on the risk-based approach in the context of AI, comprehending its origin as well as its rationale.

It has been argued that the manner in which the term ‘risk’ is employed in AI regulation differs qualitatively from how risks are encapsulated in the regulatory model of the GDPR. On the one hand, the data protection framework aims to create a level playing field for better compliance with data protection principles and firmer accountability on behalf of controllers, in which the identification of risks is followed by an assessment and an attempt to mitigate them. On the other hand, the AI Act, firstly, aspires to set thresholds for ranking the risks (and thus, the technologies) which stem from AI applications; secondly, to prohibit or allow them; and, thirdly, to regulate AI technologies in a manner commensurate with the category of risk to which they belong, introducing ‘principle-based requirements’.

Understanding what risks denote in the AI proposal is of pragmatic importance in the area of compliance. The concept of risk is crucial in the establishment and operation of a risk management system (for high-risk AI applications) and in the setting up of an inventory of appropriate risk management measures, as well as in the requirement of human oversight, which aims at preventing or minimizing the risks to health, safety or fundamental rights. Although the ranking of AI applications (by risk level) is predefined by the draft Regulation, what is currently unclear is the way risks are identified, analysed, estimated, evaluated, documented and mitigated (current Article 9 of the proposal). Such risk treatment, in the context of AI, is inextricably interrelated with the dedicated concept of risk per se and is expected to be better comprehended through the examination of its modalities.

The HR-Recycler project is directly affected by the proposed Regulation's modifications to the legal framework around artificial intelligence. Although the impact assessment carried out and monitored in this project concerns the fundamental rights to data protection and privacy, as well as ethics, it cannot be excluded that legal compliance would also be required for artificial intelligence, on top of other obligations. The risks to health and safety need to be identified, systematized, assessed and mitigated where necessary; at present, risks to health and safety are not within the impact assessment's scope.

However, the regulation of artificial intelligence is an ongoing process, and to date no binding document exists. The Regulation is not expected to be in force before 2024-2025, when additional legal obligations will be introduced to cover aspects of artificial intelligence which, nowadays, are either voluntarily regulated or developed under ethical guidelines for trustworthy applications.

Nikolaos Ioannidis

Vrije Universiteit Brussel

Towards 5.0 with robots and people


The generations of robots have evolved as they have been incorporating technologies, acquiring new skills in our environments, seeking their autonomy and intelligence as their objective at the service of the activity for which they were programmed. This evolution and deployment of robots has been especially rapid in recent years also in industry. According to the World Robotics 2021 reports, there are 3 million industrial robots operating in factories around the world, an increase of 10% from 2020.

Michael Knasel, director of the Center for Robotic Applications at Science Application Inc., argued in 1984 that robots would evolve over five generations (Engelberger, J. “Robotics Today”, Robotics, pp. 59, 1979), and time is proving him right. The first two generations, already achieved in the eighties, covered the management of repetitive tasks with very limited autonomy. The third generation would incorporate artificial vision; the fourth, advanced mobility outdoors and indoors; and the fifth would enter the domain of Artificial Intelligence. In more detail:

  1. First Generation: handling robots (“pick-and-place”). Robots dedicated to picking and placing materials, used to complement industrial machines and mounted on a fixed base. They appeared in 1982.
  2. Second Generation: robots begin to learn. They move along a path and are capable of memorizing different movement sequences or specific, predetermined tasks. They appeared in 1984.
  3. Third Generation: robots with senses (vision and touch), AGVs. They have more advanced control, with precision servomechanisms, and move self-guided; they are widely used in warehouses and internal logistics. They appeared in 1989.
  4. Fourth Generation: mobile and intelligent robots. Robots with wheels or artificial legs, beginning to have a humanoid or similar shape, equipped with intelligent sensors to carry out maintenance tasks, entertainment, customer reception, even acting as pets. They appeared in 2000.
  5. Fifth Generation: singular robots based on artificial intelligence. Their controllers are based on advanced artificial intelligence. They are endowed with a high level of mobility and, most importantly, they seek interaction with humans and are able to imitate their thinking. They appeared in 2010.

Being already in the fifth generation, or what could be called robot 5.0, it may be curious and even significant that we are still focused on Industry 4.0 and on the 4.0 worker, when it seems, at least by analogy, that these versions should go hand in hand. What should we add to these 4.0 concepts? What advances are we giving to robots that the Smart Factory does not have? Is that extra Artificial Intelligence what provides the qualitative leap to talk about Industry 5.0 and the worker/person 5.0? The answer, or at least part of it, lies in what these fifth-generation robots bring with their intelligence: interaction.

It is true that we work technologically to improve processes with Artificial Intelligence, quantum computing, IoT with 5G, etc., but the Industry 5.0 concept must go beyond a technological contribution or leap: it must include the person, with human-machine fusion precisely as a core concept. Technology must be at the service of the person, and thereby increase productivity and efficiency.

The European Commission has published a report on “Industry 5.0” which precisely recognizes the power of industry to achieve social objectives beyond employment and growth, and which highlights research and innovation as drivers of a transition towards a European industry centered on the human being, sustainability and resilience.

There are already projects where this change, also disruptive, is being channelled and affects the business, political and intellectual spheres. A change that brings greater well-being to society as a whole. An example is HR-Recycler, a project committed to the environment and to adapting intelligent industrial systems to the operator.

The goal is to create a symbiotic, interoperable and fluid environment between intelligent systems and people, with the ultimate aim of empowering the person and the consequent “humanization” of industrial processes: a hybrid human-robot environment.

Intelligent industrial systems and Artificial Intelligence themselves can contribute to the creation of fairer systems, and to reinforcing the training of workers so that they can fill the new profiles required, thus softening the impact of the restructuring of work and empowering the person within the new environment of Industry 5.0.

Including this new contribution will have an impact on production centers, technology providers and training centers alike. However, what we must not lose sight of is that the true evolution lies in the intelligence we must put in to act with sound judgment and reach 5.0.

Indumetal discuss digital transformation needs within the E-Waste domain


Main challenge to the Industry 4.0

For many traditional, labour-intensive industries, as is the case for most European E-Waste recycling companies, the rapid digitalization of the world and of the professional sphere is a situation that is difficult to embrace but one we must face. Any transformation is usually a long and tough process, and this is clearly the case for the digital transformation, which requires a clear strategic vision and open-minded leaders who are ready for a change in the company.

Clearly, the 4.0 transformation takes the emphasis on implemented digital technology to a whole new level with the help of interconnectivity through the Internet of Things (IoT), access to real-time data, etc. Theoretically, the main objective is having the right information at the proper time to make the correct decision.

However, the key transformation is the connection between the physical and the digital, which allows better collaboration and access between people, partners, departments, suppliers, and products. It is a process that does not change what is done but the way of doing it.

Thus, the critical change introduced by Industry 4.0 is not exactly in the production or recycling phase but in the innovation of the business model. It is a holistic approach that has human development as its axis. Recycling needs a transformation of its internal culture.

New training is needed, as well as new digital skills and new professional profiles that allow all phases of the data work process to be tackled with guarantees. Moreover, companies need to transform rapidly. The current feeling of vulnerability comes from this required speed: whereas the change from an agrarian society to an industrial one was linear, the current acceleration is exponential.

In this way, the electrical and electronic waste recycling sector needs to move at double speed for two main reasons: first, because E-Waste is the fastest-growing waste stream in the world; and second, because the sector needs to be able to change its culture at the same pace as other industrial sectors in order to survive.

The big question is: How can it be done? Some proposals are the following: investing much more in skilled people, creativity and innovation, and new markets.

This is the human challenge facing digital transformation in our industries.

Sadako shares the integration results reached during tests done in late October 2021


During the past year, the HR-Recycler project has moved to the integration phase, where all the technologies developed in the early phases are implemented in real world scenarios, giving a good assessment of the progress made or still needed to reach the goals defined in the project proposal.

In this blog post, Sadako shares the integration results reached during tests done in late October 2021 in partnership with COMAU as well as the latest updates on the pilot integration tasks done in ECORESET in December 2021.

1.    Integration tests with COMAU

Throughout 2021, Sadako has perfected a Neural Network model able to detect the different types of WEEE objects encountered in the HR-Recycler project and integrated it into real-time software specifically designed for the Classification cell. The software is able to detect and locate the objects in 3D space with a custom point-cloud processing algorithm.

This software was tested and fine-tuned in integration tests carried out continuously throughout 2021, with the last one performed at COMAU's premises in late October. This test was the occasion to measure the accuracy of the detection software as well as the capabilities of the NJ-220 robotic arm and gripper in the Classification cell scenario, as illustrated in the following sequence:

Figure 1: Right sequence: object being picked up as seen by the vision camera. Left Sequence: General view of the object being picked up

The tests were successful, as 90% of the objects used for the test were correctly identified, located, grasped, and moved to their target location. These results give a high confidence that the prototype will reach the desired performance when deployed at the ECORESET pilot site.

2.    ECORESET pilots

In December 2021, CERTH, IBEC, ROBOTNIK and SADAKO travelled to the ECORESET premises in Athens to perform a first phase of the pilot integration. During this visit, the camera framework designed by Sadako for the Disassembly workbench was installed alongside the workbench designed and built by ECORESET. This camera framework is designed to accommodate the cameras for operator gesture detection as well as the ones for object detection for cobot disassembly tasks, as shown in the following images:

Figure 2: Left image: constructed camera framework over the disassembly workbench. Right image: CAD view of the designed camera framework

This first integration step was an opportunity to successfully test the operator location detection and the gesture detection software in their final environment, as well as for the other partners present during this visit to test their latest software and hardware developments:

Figure 3: HR Recycler partners working during the Ecoreset pilots

In particular, an integration test involving IBEC, ROBOTNIK and SADAKO was designed to test IBEC’s central task planner:

Figure 4: IBEC-ROBOTNIK-SADAKO integration test. Sadako’s vision software output is visible on the monitor on the top left of the image

The test was designed so that the pallet truck had to perform a sequence given by the task planner which involved raising and lowering the pallet in a given sequence. The task planner and interaction manager were configured so that a specific gesture made by the operator and detected by the gesture recognition software would interrupt the pallet truck sequence, and another one would resume it. This test was successful and showed the correct flow of information between the different sensors, software, and robotic actuators.
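The pause/resume flow exercised in this test can be sketched as a small state machine. The class, method and gesture names below are illustrative only, not the project's actual interfaces.

```python
# Illustrative sketch of the gesture-interrupt logic described above:
# one gesture pauses the pallet-truck sequence, another resumes it.

class PalletTruckSequence:
    def __init__(self, steps):
        self.steps = list(steps)
        self.index = 0
        self.paused = False

    def on_gesture(self, gesture: str):
        """Handle a gesture reported by the recognition software (names assumed)."""
        if gesture == "stop":
            self.paused = True
        elif gesture == "resume":
            self.paused = False

    def tick(self):
        """Execute the next step of the sequence unless paused; return it."""
        if self.paused or self.index >= len(self.steps):
            return None
        step = self.steps[self.index]
        self.index += 1
        return step


seq = PalletTruckSequence(["raise", "lower", "raise"])
seq.tick()               # executes "raise"
seq.on_gesture("stop")   # operator gesture interrupts the sequence
seq.tick()               # returns None: sequence is paused
seq.on_gesture("resume")
seq.tick()               # executes "lower"
```

In the actual test, the gestures were detected by the gesture recognition software and routed through the task planner and interaction manager rather than called directly.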

The latest progress in the HR-Recycler project involves a great deal of effort towards the realization of integration tests at the pilot sites. The results gathered from these preliminary tests are promising and hint at a successful deployment of the complete HR-Recycler pipeline during the following year.

IBEC present an interaction manager to promote reliable HRI



Human-Robot Interactions (HRI) have a leading role in hybrid human-robot collaborative environments, such as recycling plants for Waste Electrical and Electronic Equipment (WEEE), where qualified human workers need to interact with their robotic co-workers reliably and effectively. However, in this type of hybrid environment, HRI becomes challenging given the complexity of the agents involved, the work to be done, and the environment where the interactions occur.

To ensure effective and safe interactions in the context of HR-Recycler, at IBEC we are developing a control module able to orchestrate bidirectional interactions between human workers and their robotic counterparts. For this purpose, an Interaction Manager has been designed to inform the human worker about the state of their workplace.

The Interaction Manager is integrated within a ROS infrastructure, from which it has access to the information of other cognitive modules, such as the Worker Model, the Moral Engine or the Task Planner. Additionally, the Interaction Manager provides a visual interface that allows the humans in the working environment (the recycling plant) to interact with the non-human agents via a tablet device (Figure 1).

Figure 1. Tablet with the HR-Recycler Interaction Manager App displaying information regarding worker’s demographics, moral preferences and status of the Task Planner.

This application visualizes the worker's information when the worker is detected by SADAKO's computer-vision modules. Specifically, the worker's ID, role, language and trust level are provided by the Worker Model. Moreover, once the worker is identified and the Moral Engine adjusts the robot's actuation speed and safety distance, these personalized parameters are shown on the tablet, allowing the worker to better understand the robot's performance.

The workflow of the a-cell has been divided into two disassembly parts, allowing the agents involved to work in parallel. In the “Disassembling Process A” panel, the robot's workflow state is displayed by querying the Task Planner's current status, and a progress bar reports how far the robot is from completing its disassembly process. The “Disassembling Process B” panel enables the worker to notify the system when the device has been fully disassembled.

Importantly, the HR-Recycler Interaction Manager anticipates that human-robot interactions will not always occur in trouble-free situations. Given the complexity of the disassembly tasks that robots must perform, additional difficulties may arise, which will be notified by an alert panel reporting the error code and its source (Figure 2). It is then up to the human worker to decide how the problem will be solved. To that end, relevant options are displayed, and the selected solution is sent back to the Task Planner.

Figure 2. Task Planner alert message providing information of the problem source and offering options to proceed.

Another source of exceptions in the normal workflow is the violation of the personalized safety measures. In case the human worker comes closer than the Human-Robot safety distance, another alert panel is dedicated to reporting such an event (Figure 3). In this case, the robot's normal functioning will not resume until the worker's distance from the robot again exceeds the safety threshold. Both distances are displayed in real time.
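The distance interlock described here can be sketched as a tiny monitor. The class and attribute names are illustrative, not the Interaction Manager's actual code.

```python
# Illustrative sketch of the personalized safety-distance check:
# the robot may only run while the worker is farther than the threshold.

class SafetyDistanceMonitor:
    def __init__(self, safety_threshold_m: float):
        self.safety_threshold_m = safety_threshold_m  # personalized per worker
        self.violated = False

    def update(self, worker_distance_m: float) -> bool:
        """Return True if the robot may keep running at this worker distance."""
        self.violated = worker_distance_m <= self.safety_threshold_m
        return not self.violated
```

In the real system, the threshold would come from the Moral Engine's personalized parameters and a violation would raise the alert panel rather than just return a flag.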

Figure 3. Moral Engine test at Sadako’s WEEE reproduced recycling plant. The moral alert message reports that the safety threshold has been crossed.

Future developments on the Interaction Manager visual interface will include the automatic personalization of language settings and feedback provision from other Interaction Manager functionalities, such as the recognition of gesture-based HRI. Altogether, this Interaction Manager application and its tablet interface will serve as an additional communication channel ensuring reliable and effective Human-Robot Interaction, considering real-time environmental demands and being adaptive to each worker’s preferences.

COMAU describe new tools for HRC


New tools for human robot collaboration

In the context of the HR-Recycler project, the collaboration between human and robot is a crucial aspect, allowing a safe and dynamic sharing of the operational area. Due to the peculiar environmental conditions of the recycling factory, which include the presence of dust, dirty surfaces and a not completely structured setup, a robust and flexible device for safety detection and monitoring in the classification and disassembly areas is required.

After an extended investigation of the several options on the market, a specific solution based on radar technology by the company Inxpect has been selected. This product presents interesting features that fit quite well with the requirements of a recycling factory: resistance to environmental disturbances, high sensitivity, 3D safety monitoring and several configurable detection fields.

Radar (Radio Detection and Ranging) is a well-known detection system that uses radio waves for detection and localization. It can measure the distance, angle or velocity of objects, and it consists of a transmitter, a receiver and a processor that carries out the low-level computations. Radio waves (pulsed or continuous) emitted from the transmitter reflect off the object and return to the receiver, giving information about the object's location and speed. Electromagnetic waves are extremely resistant to disturbances like dust, debris, smoke or darkness, and they travel at the speed of light (about 300,000 km/s).
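As a worked example of the ranging principle, a pulsed radar converts the round-trip echo time into distance with d = c·t/2, since the wave covers the distance twice:

```python
# Basic pulsed-radar range equation: the echo travels out and back,
# so the one-way distance is half the round-trip path.

SPEED_OF_LIGHT_M_S = 3.0e8  # approx. 300,000 km/s


def range_from_echo(round_trip_time_s: float) -> float:
    """Distance to the reflecting object, in metres."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0


# A 20-nanosecond round trip corresponds to a target about 3 m away.
```

Commercial safety radars such as the one selected here typically use frequency-modulated continuous-wave techniques rather than simple pulses, but the underlying geometry is the same.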

Figure 1: safety radar sensor and its own controller

For the HR-Recycler project, after some preliminary feasibility studies, COMAU, together with the sensors' supplier and the involved partners, opted for a solution that includes six radar sensors placed at specific spots of the cell. These are arranged to cover all the possible interference areas between human and machine, in particular monitoring the dangerous area where an industrial, non-collaborative robot (COMAU NJ 220) is used in the classification process for the pick and place of WEEE components.

The safety signals from the sensors are collected in a specific controller that sends its output to a safe PLC. The robot controller is in turn connected to the safe PLC through safe signals, allowing an overall safe management of the cell. When an operator accesses a safeguarded area, the sensors detect the movement and the robot is stopped immediately; moreover, as long as a presence is detected inside the specific area (even micro-movements such as breathing can be detected), the system is prevented from restarting, avoiding in this way any dangerous condition for the worker.
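The two monitoring modes just described can be sketched as a simple state machine. This is an illustrative model, not COMAU's actual safe-PLC program; a real safety implementation would also require certified hardware and typically a manual reset.

```python
# Illustrative two-state interlock: a detection stops the robot and switches
# the sensors to presence monitoring; restart is inhibited until the area is
# clear again.

class RadarSafetyInterlock:
    def __init__(self):
        self.mode = "access"  # access monitoring while the area is clear

    def update(self, detected: bool) -> bool:
        """Return True when the robot is allowed to run."""
        if detected:
            self.mode = "presence"  # stop robot, switch to presence monitoring
        elif self.mode == "presence":
            self.mode = "access"    # area clear again: restart permitted
        return self.mode == "access"
```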

The two different monitoring states of the sensors, namely access to and presence in the dangerous area, are shown in the following pictures:

Figure 2: access monitoring of HR-recycler robot station

If the worker enters the working area, the safety radar sensors detect the movement (Figure 2) and consequently stop the robot; after the access, the sensors change their configuration (Figure 3) and prevent the restart of the system while the worker is inside the area.

Figure 3: presence monitoring of HR-recycler robot station

The complete cell will be set up and tested in the following months at COMAU's premises, where a replica of the real recycling cell has been arranged for testing purposes. After the assessment of the overall solution, the entire cell will be moved and deployed at the end user's facilities for the final evaluation of the industrial process.

ECORESET present the preliminary pilot setup that took place at its premises


Preliminary pilot setup

During the period 14-22/12/2021, the preliminary pilot setup in the frame of the HR-Recycler Project took place at the WEEE recycling plant of Ecoreset, with the participation of CERTH, SADAKO, ROBOTNIK, GAIKER and ECORESET.

Remarkable progress has been achieved regarding, among other things:

  • The completion of the construction of the working benches
  • The camera set up and recording
  • The pallet truck initial set up and localization
  • Installation of the interaction manager with real-time gesture detection

The completion of the set-up will continue early next year at the Ecoreset plant. The first pilot run is expected within Q1 2022.

CERTH share preliminary results on screw detection and localization in 3D Space


Preliminary Results on Screw Detection and localization in 3D Space

The purpose of this task was to:

  • Check the depth accuracy of the 3D sensors (RealSense modules)
  • Estimate the error introduced by the transformations performed to convert the object detections from 2D to 3D coordinates
  • Evaluate the impact of the perspective transformations (camera to robot base and inverse).

The following setup has been created in our lab (Figure 1).

  • An ArUco marker representing the robot base has been placed on the lab's table. “Aruco point” refers to its center point.
  • For precision purposes, screws of 7 mm diameter with a 4 mm cross in the center have been printed (Points 1-6). This is the real size of the screws on the microwave oven.

Real World distances.

  • Grid paper has been used as a guide for measuring the distances between every pair of points. Moreover, a hand meter and a ruler were used for confirmation purposes. At this stage, all the measurements have been calculated with respect to the camera location.

ROS Calculated distances.

  • Using the mouse pointer in the RGB frame of the official RealSense viewer, the center (2D pixel coordinates) of each screw (point) has been found. These points have been converted to 3D coordinates using the RealSense deproject function, taking into account the intrinsic parameters of the camera. Based on the 3D coordinates, the distance between each pair of points has been calculated using the Euclidean distance formula.
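The deprojection step can be reproduced with the standard pinhole model, which is what the RealSense deproject function implements when lens distortion is ignored. The intrinsic values and pixel coordinates below are hypothetical, chosen only for illustration:

```python
# Pinhole-model sketch of the pixel-to-3D conversion and the distance
# computation described above. Intrinsics (fx, fy, cx, cy) are illustrative.

import math


def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Pixel (u, v) plus depth -> 3D point in the camera frame (metres)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)


def euclidean(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))


# Two screw centres at 0.5 m depth, hypothetical intrinsics fx=fy=600, cx=320, cy=240:
p1 = deproject(320, 240, 0.5, 600, 600, 320, 240)
p2 = deproject(380, 240, 0.5, 600, 600, 320, 240)
# euclidean(p1, p2) -> 0.05 m (60 px * 0.5 m / 600)
```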

A sample chart of the differences between the Real World and ROS Calculated distances for Point 2 is shown above.

  • For visualization purposes, PointStamped ROS messages have been created for each point using the aforementioned 3D coordinates. Figure 2 illustrates the PointStamped messages derived from the above procedure.

Robot Base perspective transformation

  • The same procedure and lab setup have been followed for this approach too. The only difference is that every coordinate is transformed with respect to the ArUco marker (robot base). This procedure has been used to estimate the perspective transformation error. Figure 3 shows the transformed PointStamped messages in RViz.

Additional Notes and Conclusion

  • Supplementary charts that adequately demonstrate important relationships or patterns between the data points can be provided upon request. These charts mainly highlight the offset between each pair of points.
  • In general, the procedure that has been followed is prone to millimetre-level errors due to the hand-measured distances and hand-picked pixel values. However, the results seem to be quite good.
  • A real scenario with the microwave oven that will be used at ECORESET during the 1st pilot of the project has been examined too. In this approach, screws have been detected in 2D using the enhanced screw detector module developed by CERTH. The middle (cross-center) pixel of each screw has been projected to 3D space. A PointStamped message has been created and visualized in RViz alongside the scene's point cloud. Using the attached .rviz configuration again, the precision of the approach followed in our lab can easily be observed. Figure 5 offers a screenshot of this visualization.