Sadako shares the integration results reached during tests done in late October 2021


During the past year, the HR-Recycler project has moved into the integration phase, where all the technologies developed in the earlier phases are implemented in real-world scenarios, giving a good assessment of the progress made, and still needed, to reach the goals defined in the project proposal.

In this blog post, Sadako shares the integration results reached during tests done in late October 2021 in partnership with COMAU as well as the latest updates on the pilot integration tasks done in ECORESET in December 2021.

1.    Integration tests with COMAU

Throughout 2021, Sadako has perfected a Neural Network model able to detect the different types of WEEE objects encountered in the HR-Recycler project and integrated it into real-time software specifically designed for the Classification cell. The software is able to detect and locate the objects in 3D space with a custom point-cloud processing algorithm.
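The detect-and-locate step described above can be sketched as follows: given a point cloud whose points are paired with the pixel they project to, the 3D position of a detected object is estimated as the centroid of the points falling inside the detection's 2D bounding box. This is a minimal illustration, not Sadako's actual (custom) algorithm:

```python
def locate_in_3d(points, pixels, bbox):
    """Estimate an object's 3D position from a detection's 2D box.

    points : list of (x, y, z) cloud points in metres, camera frame
    pixels : list of the (u, v) pixel each point projects to
    bbox   : (u_min, v_min, u_max, v_max) 2D detection bounding box
    Returns the centroid of the points whose pixel falls inside the
    box, or None when the detection contains no valid depth data.
    """
    u_min, v_min, u_max, v_max = bbox
    hits = [p for p, (u, v) in zip(points, pixels)
            if u_min <= u <= u_max and v_min <= v <= v_max]
    if not hits:
        return None
    n = len(hits)
    return tuple(sum(p[i] for p in hits) / n for i in range(3))
```

A real point-cloud pipeline would additionally filter outliers and background points before averaging; the centroid step only conveys the basic idea.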

This software has been tested and fine-tuned during integration tests carried out continuously throughout 2021, the last of which took place at COMAU’s premises in late October. This test was the occasion to measure the accuracy of the detection software as well as the capabilities of the NJ-220 robotic arm and gripper in the Classification cell scenario, as illustrated in the following sequence:

Figure 1: Right sequence: the object being picked up as seen by the vision camera. Left sequence: general view of the object being picked up

The tests were successful: 90% of the objects used for the test were correctly identified, located, grasped, and moved to their target location. These results give high confidence that the prototype will reach the desired performance when deployed at the ECORESET pilot site.

2.    ECORESET pilots

In December 2021, CERTH, IBEC, ROBOTNIK and SADAKO travelled to the ECORESET premises in Athens to perform a first phase of the pilot integration. During this visit, the camera framework designed by Sadako for the Disassembly workbench was installed alongside the workbench designed and built by ECORESET. This camera framework is designed to accommodate the cameras for operator gesture detection as well as the ones for object detection for cobot disassembly tasks, as shown in the following images:

Figure 2: Left image: constructed camera framework over the disassembly workbench. Right image: CAD view of the designed camera framework

This first integration step was an opportunity to successfully test the operator location detection and the gesture detection software in their final environment, as well as for the other partners present during this visit to test their latest software and hardware developments:

Figure 3: HR Recycler partners working during the Ecoreset pilots

In particular, an integration test involving IBEC, ROBOTNIK and SADAKO was designed to test IBEC’s central task planner:

Figure 4: IBEC-ROBOTNIK-SADAKO integration test. Sadako’s vision software output is visible on the monitor on the top left of the image

The test was designed so that the pallet truck had to perform a sequence given by the task planner which involved raising and lowering the pallet in a given sequence. The task planner and interaction manager were configured so that a specific gesture made by the operator and detected by the gesture recognition software would interrupt the pallet truck sequence, and another one would resume it. This test was successful and showed the correct flow of information between the different sensors, software, and robotic actuators.
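The interrupt/resume flow described above can be sketched as a small state machine; the gesture names and the representation of the sequence are illustrative, not the ones used in the actual pilot:

```python
class PalletTruckSequencer:
    """Toy model of the interaction flow: a task sequence that a
    'pause' gesture interrupts and a 'resume' gesture restarts."""

    PAUSE_GESTURE = "raise_hand"   # illustrative gesture names, not
    RESUME_GESTURE = "thumbs_up"   # the ones used in the pilot

    def __init__(self, sequence):
        self.sequence = list(sequence)  # e.g. ["raise", "lower", ...]
        self.step = 0
        self.paused = False

    def on_gesture(self, gesture):
        """React to a gesture reported by the recognition software."""
        if gesture == self.PAUSE_GESTURE:
            self.paused = True
        elif gesture == self.RESUME_GESTURE:
            self.paused = False

    def tick(self):
        """Execute the next action unless paused or finished."""
        if self.paused or self.step >= len(self.sequence):
            return None
        action = self.sequence[self.step]
        self.step += 1
        return action
```

In the real system the same flow of information passes through the task planner and interaction manager over ROS topics rather than direct method calls.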

The latest progress in the HR-Recycler project involves a great deal of effort towards the realization of integration tests at the pilot sites. The results gathered from these preliminary tests are promising and hint at a successful deployment of the complete HR-Recycler pipeline during the following year.

IBEC present an interaction manager to promote reliable HRI



Human-Robot Interactions (HRI) have a leading role in hybrid human-robot collaborative environments (e.g. recycling plants for Waste Electrical and Electronic Equipment, WEEE), where qualified human workers need to interact with their robotic co-workers reliably and effectively. However, in this type of hybrid environment, HRI becomes challenging given the complexity of the agents involved, the work to be done, and the environment where the interactions occur.

To ensure effective and safe interactions in the context of HR-Recycler, at IBEC we are developing a control module able to orchestrate bidirectional interactions between human workers and their robotic counterparts. For this purpose, an Interaction Manager has been designed to inform the human worker about the state of their workplace.

The Interaction Manager is integrated within a ROS infrastructure, from which it has access to the information of other cognitive modules, such as the Worker Model, the Moral Engine or the Task Planner. Additionally, the Interaction Manager provides a visual interface that allows the humans in the working environment (the recycling plant) to interact with the non-human agents via a tablet device (Figure 1).

Figure 1. Tablet with the HR-Recycler Interaction Manager App displaying information regarding worker’s demographics, moral preferences and status of the Task Planner.

This application visualizes the worker’s information when detected by SADAKO’s computer vision modules. Specifically, the worker’s ID, role, language, and trust level are provided by the Worker Model. Moreover, once the worker is identified and the Moral Engine adjusts the robot’s actuation speed and safety distance, these personalized parameters are shown on the tablet, allowing the worker to better understand the robot’s performance.

The workflow of the a-cell has been divided into two disassembling parts, allowing the agents involved to work in parallel. In the “Disassembling Process A” panel, the robot’s workflow state is displayed by querying the Task Planner’s current status, and a progress bar reports how close the robot is to completing its disassembling process. The “Disassembling Process B” panel enables the worker to notify when the device has been fully disassembled.

Importantly, the HR-Recycler Interaction Manager anticipates that human-robot interactions will not always occur in trouble-free situations. Given the complexity of the disassembling tasks that robots must perform, additional difficulties may arise, which are notified via an alert panel reporting the error code and its source (Figure 2). It is then up to the human worker to decide how the problem will be solved. To that end, relevant options are displayed, and the selected solution is sent back to the Task Planner.

Figure 2. Task Planner alert message providing information of the problem source and offering options to proceed.

Another source of exceptions in the normal workflow is the violation of the personalized safety measures. If the human worker comes closer than the Human-Robot safety distance, another alert panel is dedicated to reporting such an event (Figure 3). In this case, the robot’s normal functioning will not resume until the worker’s current distance to the robot again exceeds the safety threshold. Both distances are displayed in real time.

Figure 3. Moral Engine test at Sadako’s WEEE reproduced recycling plant. The moral alert message reports that the safety threshold has been crossed.
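The safety-distance behaviour described above (stop below the personalized threshold, resume only once the distance exceeds it again, with both distances reported) can be sketched as follows; the threshold value and the alert message format are illustrative, not those of the actual application:

```python
class SafetyDistanceMonitor:
    """Minimal sketch of the personalized safety-distance check.

    In HR-Recycler the threshold comes from the Moral Engine's
    per-worker settings; here it is just a constructor argument.
    """

    def __init__(self, safety_threshold_m):
        self.threshold = safety_threshold_m
        self.robot_running = True

    def update(self, distance_m):
        """Process one distance reading; return alert text, if any."""
        if distance_m < self.threshold:
            self.robot_running = False  # violation: stop the robot
            return (f"SAFETY ALERT: worker at {distance_m:.2f} m, "
                    f"threshold {self.threshold:.2f} m")
        self.robot_running = True       # safe again: resume
        return None
```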

Future developments on the Interaction Manager visual interface will include the automatic personalization of language settings and feedback provision from other Interaction Manager functionalities, such as the recognition of gesture-based HRI. Altogether, this Interaction Manager application and its tablet interface will serve as an additional communication channel ensuring reliable and effective Human-Robot Interaction, considering real-time environmental demands and being adaptive to each worker’s preferences.

COMAU describe new tools for HRC


New tools for human robot collaboration

In the context of the HR-Recycler project, the collaboration between human and robot is a crucial aspect, allowing a safe and dynamic sharing of the operational area. Due to the particular environmental conditions of a recycling factory, which include the presence of dust, dirty surfaces, and a not completely structured setup, a robust and flexible device is required for safety detection and monitoring in the classification and disassembling areas.

After an extensive investigation of the several options available on the market, a specific solution based on radar technology by the company Inxpect has been selected. This product presents interesting features that fit the requirements of a recycling factory quite well: resistance to environmental disturbances, high sensitivity, 3D safety monitoring, and several configurable detection fields.

Radar (Radio Detection and Ranging) is a well-known detection system that uses radio waves for detection and localization. It can measure the distance, angle, or velocity of objects, and it consists of a transmitter, a receiver and a processor that carries out the low-level computations. Radio waves (pulsed or continuous) emitted from the transmitter reflect off the object and return to the receiver, giving information about the object’s location and speed. Electromagnetic waves are extremely resistant to disturbances like dust, debris, smoke or darkness, and they travel at close to the speed of light (300,000 km/s).

Figure 1: safety radar sensor and its own controller
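The basic ranging principle mentioned above, distance from the round-trip time of a reflected pulse, can be expressed in a few lines (a textbook formula, not Inxpect's actual signal processing, which is based on more sophisticated modulation):

```python
C = 299_792_458.0  # speed of light in m/s

def radar_distance(round_trip_time_s):
    """Distance to a target from the pulse round-trip time.

    The wave covers the target distance twice (out and back),
    hence d = c * t / 2.
    """
    return C * round_trip_time_s / 2.0
```

For example, an echo arriving 20 nanoseconds after emission corresponds to a target roughly 3 m away.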

For the HR-Recycler project, after some preliminary feasibility studies, COMAU, together with the sensors’ supplier and the partners involved, opted for a solution that includes six radar sensors placed in specific spots of the cell. These are arranged to cover all the possible interference areas between human and machine, in particular monitoring the dangerous area where an industrial, non-collaborative robot (COMAU NJ 220) is used in the classification process for the pick-and-place of WEEE components.

The safety signals from the sensors are collected in a dedicated controller that sends its output to a safety PLC. The robot controller is in turn connected to the safety PLC through safe signals, allowing an overall safe management of the cell. When the operator accesses a safeguarded area, the sensors detect the movement and the robot is stopped immediately; moreover, for as long as a presence is detected inside the specific area (even micro-movements such as breathing can be detected), the system is prevented from restarting, avoiding in this way any dangerous condition for the worker.

The two monitoring states of the sensors, namely access to and presence in the dangerous area, are shown in the following pictures:

Figure 2: access monitoring of HR-recycler robot station

If the worker enters the working area, the safety radar sensors detect their movement (Figure 2) and consequently stop the robot; after the access, the sensors change their configuration (Figure 3) and prevent the restart of the system for as long as the worker remains inside the area.

Figure 3: presence monitoring of HR-recycler robot station
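The access/presence interlock described above can be sketched as a toy model; the real implementation lives in the safety PLC and certified sensor firmware, not in application code:

```python
class SafetyCell:
    """Sketch of the radar interlock: entering the area stops the
    robot; restart is blocked while presence is still detected."""

    def __init__(self):
        self.robot_stopped = False

    def on_radar(self, presence_detected):
        """Access or presence detected: stop the robot at once."""
        if presence_detected:
            self.robot_stopped = True
        return self.robot_stopped

    def request_restart(self, presence_detected):
        """A restart succeeds only once the area is confirmed empty
        (even micro-movements keep presence_detected True)."""
        if presence_detected:
            return False
        self.robot_stopped = False
        return True
```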

The complete cell will be set up and tested in the following months at COMAU’s premises, where a replica of the real recycling cell has been arranged for testing purposes. After the assessment of the overall solution, the entire cell will be moved and deployed in the end user’s facilities for the final evaluation of the industrial process.

ECORESET present the preliminary pilot setup that took place at their premises


Preliminary pilot setup

During the period 14-22/12/2021, the preliminary pilot setup in the frame of the HR-Recycler project took place in the WEEE recycling plant of Ecoreset, with the participation of CERTH, SADAKO, ROBOTNIK, GAIKER and ECORESET.

Remarkable progress has been achieved regarding, among other things:

  • The completion of the construction of the working benches
  • The camera set up and recording
  • The pallet truck initial set up and localization
  • Installation of the interaction manager with real-time gesture detection

The completion of the set-up will continue early next year in the plant of Ecoreset. The first pilot run is expected within Q1 2022.

CERTH share preliminary results on screw detection and localization in 3D Space


Preliminary Results on Screw Detection and localization in 3D Space

The purpose of this task was to:

  • Check the depth accuracy of the 3D sensors (Realsense Modules)
  • Estimate the error introduced by the transformations used to convert object detections from 2D to 3D coordinates
  • Evaluate the impact of the perspective transformations (camera to robot base and inverse).

The following setup has been created in our lab (Figure 1).

  • An ArUco marker that represents the robot base has been placed on the lab’s table. “Aruco point” refers to its center point.
  • For precision purposes, screws of 7 mm diameter with a 4 mm cross in the center have been printed (Points 1-6). This is the real size of the screws on the microwave oven.

Real-world distances.

  • Grid paper has been used as a guide for measuring the distances between every pair of points. Moreover, a hand meter and a ruler were used for confirmation purposes. At this stage, all the measurements have been taken with respect to the camera location.

ROS-calculated distances.

  • Using the mouse pointer in the RGB frame of the official RealSense viewer, the center (2D pixel coordinates) of each screw (point) has been found. Those points have been converted to 3D coordinates using the RealSense deproject function, taking into account the intrinsic parameters of the camera. Based on the 3D coordinates, the distance between each pair of points has been calculated using the standard Euclidean formula d = √((x₂ − x₁)² + (y₂ − y₁)² + (z₂ − z₁)²).
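Assuming a simple pinhole camera model (the RealSense deproject function additionally corrects for lens distortion), the 2D-to-3D conversion and the point-to-point distance can be sketched as:

```python
import math

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole deprojection of pixel (u, v) with known depth to a 3D
    point in the camera frame, given the focal lengths (fx, fy) and
    principal point (cx, cy) from the camera intrinsics."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return (x, y, depth_m)

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
```

For instance, with fx = fy = 600 px and principal point (320, 240), a pixel 60 px right of center at 1 m depth deprojects to a point 0.1 m off the optical axis.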

A sample chart of the differences between the real-world and ROS-calculated distances for Point 2 is shown above.

  • For visualization purposes, PointStamped ROS messages have been created for each point using the aforementioned 3D coordinates. Figure 2 illustrates the PointStamped messages derived from the above procedure.

Robot Base perspective transformation

  • The same procedure and lab setup have been followed for this approach too. The only difference is that every coordinate is transformed with respect to the ArUco marker (robot base). This procedure has been used to estimate the perspective transformation error. Figure 3 shows the transformed PointStamped messages in RViz.

Additional Notes and Conclusion

  • Supplementary charts that adequately demonstrate important relationships or patterns between the data points can be provided upon request. Those charts mainly highlight the offset between each pair of points.
  • In general, the procedure followed is prone to errors of the order of millimetres, due to the hand-measured distances and the hand-picked pixel values. Nevertheless, the results seem to be quite good.
  • A real scenario with the microwave oven that will be used at ECORESET during the 1st pilot of the project has been examined too. Screws in this approach have been detected in 2D using the enhanced screw detector module developed by CERTH. The middle (cross-center) pixel of each screw has been projected to 3D space. A PointStamped message has been created and visualized in RViz alongside the scene’s point cloud. Using again the attached .rviz configuration, you can easily observe the precision of the approach followed in our lab. Figure 5 offers a screenshot of this visualization.

CS GROUP elaborate on their developed virtual training system for the WEEE recycling industry


The ambitious H2020 HR-Recycler project answers the need to create a new human-robot collaborative environment, where WEEE can be disassembled in a safe manner. To achieve an efficient collaboration in this field, 12 European partners have joined forces and are working on the various elements required to create the hybrid human-robot recycling plant for electrical and electronic equipment of the future.

Within HR-Recycler CS GROUP – France (formerly DIGINEXT) focuses on the development of an effective virtual training system for the WEEE recycling industry. The dedicated tool (Procedure Editor) enables the creation of WEEE disassembly training experiences, based on predefined procedures (see image below), as described in a previous blog post.

Once the procedures are created, they can either be implemented on the relevant object or exported in a format usable for the VR training of robots and human workers. Taking this into consideration, CS GROUP has implemented an S1000D export, described in more detail below.

S1000D is an international specification for the procurement and production of technical publications. It is an XML specification for preparing, managing, and publishing technical information for a product.

S1000D is not a one-size-fits-all solution – it is a many-sizes-fit-many solution; through a combination of business rules, selectable elements and customizable values the standard is tailored to meet the project requirements.

Source: Wikipedia

This type of export can be performed by anyone and, within HR-Recycler, results in two different files, as illustrated in the figure below. The content of each file is as follows:

  • File 1: S1000D xml – containing the required persons, required parts and the procedure.
  • File 2: 3D data xml – containing 3D file URLs, submodel URLs for each part and animation data

Below are a few specific examples of the level of information provided for each element.

The first image shows the list of people required to perform a certain procedure, along with their person category code. Consequently, the extracted file provides clear information on the type of personnel and competences required to perform a certain task.

Furthermore, in the S1000D export file it is possible to see the sequence of steps to be performed in order to complete a specific procedure. For example, it shows:

  • the type of personnel/operator required to perform a procedure (highlighted in blue),
  • the action to be performed, e.g. “unscrew” (highlighted in red),
  • the reference of the tool and the associated video sequence (highlighted in green).
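As a rough illustration of what such an export step can produce, the snippet below builds a simplified, S1000D-inspired XML fragment with Python's standard library; the element names are illustrative only, and a schema-valid S1000D data module is considerably richer (business rules, codes, metadata):

```python
import xml.etree.ElementTree as ET

def export_procedure(steps):
    """Serialize (person, action, tool) steps into a simplified,
    S1000D-inspired XML fragment. Illustrative element names only."""
    root = ET.Element("procedure")
    for person, action, tool in steps:
        step = ET.SubElement(root, "proceduralStep")
        ET.SubElement(step, "person").text = person
        ET.SubElement(step, "action").text = action
        ET.SubElement(step, "tool").text = tool
    return ET.tostring(root, encoding="unicode")

xml_out = export_procedure([("operator", "unscrew", "screwdriver")])
```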

For training purposes, it is useful to be able to visualise the different steps of the procedures, which may be complex. The figure below shows the list of animations to be played for each step of the procedure.

This approach has made it possible to easily share information on procedures with relevant partners and enable the VR training of robots and human workers.

CERTH elaborate on AR-Enabled Safety and Human-Robot Collaboration within HR-Recycler


AR-Enabled Safety and Human-Robot Collaboration in HR-Recycler

CERTH has developed an AR-enabled system to monitor and communicate with robots in the WEEE recycling environment, through the use of optical and projective AR techniques. Such a system is essential in the harsh and noisy industrial WEEE recycling environment, where noise makes vocal communication difficult, and where the scale and clutter of the environment make it challenging for workers and supervising personnel to maintain an overview of the overall process.

The AR system helps overcome the above challenges and introduces additional safety aspects to the overall process. At the most fundamental level, projective AR is used to indicate to workers the workspaces and intentions of the robots. For instance, the Autonomous Ground Vehicles (AGVs) in HR-Recycler include projectors that highlight on the ground the immediate trajectory of the robot, and luminous strips that indicate the robot’s intention to move towards the left or right. Similarly, the workstations are equipped with projectors that indicate the workspace of the collaborative robots, as well as the current targets where the robots drop off components extracted from the devices. Projective AR is a ubiquitous technique that does not require the user to wear any equipment and provides an additional degree of safety on the shop floor.

In addition to projective AR, HR-Recycler has developed an optical AR system, in which advanced robot supervision and communication are made possible through an AR Head-Mounted Display (HMD). Through the HMD, the user can observe the current state and plan of the robot or send pick-and-place goals. The navigation plan and planned arm trajectory are visualized in an intuitive manner, overlaid on top of the actual robot, allowing the user to adjust their actions in accordance with the current robot plan. In addition, through registration of the HMD with the robot reference frame, it is possible to detect whether the user has entered the workspace of the robot; in this case the robot is paused and a warning is presented to the user.
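The workspace-intrusion check enabled by registering the HMD with the robot reference frame can be sketched as follows, assuming an axis-aligned workspace box and a known HMD-to-robot homogeneous transform (both illustrative simplifications of the actual system):

```python
def in_robot_workspace(user_pos_hmd, hmd_to_robot,
                       workspace_min, workspace_max):
    """Check whether the user, tracked in the HMD frame, lies inside
    the robot's axis-aligned workspace box.

    user_pos_hmd  : (x, y, z) user position in the HMD frame
    hmd_to_robot  : 4x4 homogeneous transform as nested lists
    workspace_min : (x, y, z) lower corner of the box, robot frame
    workspace_max : (x, y, z) upper corner of the box, robot frame
    """
    x, y, z = user_pos_hmd
    p = [x, y, z, 1.0]
    # transform the position into the robot frame, row by row
    rp = [sum(hmd_to_robot[i][j] * p[j] for j in range(4))
          for i in range(3)]
    return all(lo <= c <= hi
               for c, lo, hi in zip(rp, workspace_min, workspace_max))
```

A production system would pause the robot and raise the HMD warning whenever this predicate becomes true.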

Overall, the developed AR system increases safety within the shop floor and improves the overview available to workers and supervisors, thereby improving overall process efficiency.

TECNALIA describe the challenge of measuring emotions


The challenge of measuring emotions

Action-reaction, but how can we measure it if it is an emotion?

The central nervous system plays a fundamental role in detecting and understanding emotions, cognitive processes and a series of other psychosocial constructs such as trust. The responses of the nervous system are relatively specific and show different patterns of activation depending on the situations and the emotional state. Any psychological process entails an emotional experience of greater or lesser intensity and of different quality, which is why the emotional reaction is something omnipresent in every psychological process. Likewise, all emotion involves a multidimensional experience with at least three response systems: cognitive (subjective), behavioural (expressive) and physiological (adaptive). Analysing, for example, the different dimensions of anger, it is possible to distinguish between the cognitive process (feeling angry), the behavioural (frowning) and the physiological (heart rate variation, among others).

Based on this premise, there are a whole series of psychophysiological signals that have been used to identify the various emotional processes:

  • Electrical activity of the brain: By means of an electroencephalogram (EEG) it is possible to capture the cortical activity of the brain using electrodes in contact with the scalp. These types of signals offer a lot of information, although their analysis can be very complex since each component (electrode) contains information of a temporal nature (its amplitude), modal (frequency of the wave) and topographic (location in the brain). Although traditionally the acquisition of this signal required precise and specialized instruments, today the necessary equipment is much more accessible.
  • Cerebral blood flow: Functional magnetic resonance imaging (fMRI) is a non-invasive technique that allows the measurement of brain activity. This technique is based on detecting areas of the brain with a higher concentration of blood flow, under the premise that these specific areas have greater activity compared to others with a lower flow. Although it offers great advantages at the research level, it requires bulky equipment and very high cost.
  • Variation in heart rate: This is one of the main physiological signals linked to the feeling of security or danger. At the analysis level, the variability of the period between beats is usually considered, distinguishing low-frequency components of this variability (below a threshold of, classically, 0.15 Hz) from high-frequency components above it. The main techniques to obtain this metric are the electrocardiogram (ECG), which records the electrical activity of the heart, and photoplethysmography (PPG), which uses a controlled light beam to estimate blood flow from the amount of reflected light.
  • Galvanic skin response (GSR): Also called electrodermal activity or EDA. It reflects the electrical conductance of the skin by measuring the potential difference generated between two electrodes, generally located on the phalanges of two adjacent fingers of the non-dominant hand. The excitation of the skin, normally produced in stressful situations, causes a dilation of the pores which, in turn, causes a decrease in the electrical resistance of the skin. It is necessary to distinguish its tonic or basal component from its phasic component. The former undergoes slow variations and often reflects unwanted experimental changes (for example, changes in the temperature conditions of the experiment), while the latter reflects rapid changes that are linked to the study stimulus.
  • Muscle activity: Muscle activity can be measured using a pair of electrodes aligned with the muscle’s kinematic axis (the direction in which it expands and contracts). This can be applied in a variety of situations. For example, electrodes can be positioned on both sides of the eyes (either vertically or horizontally) to detect eye movements. This specific test, called electrooculography (EOG), offers very useful information for cleaning the encephalographic signal of possible unwanted artifacts. Other generic applications include the detection of involuntary movements in response to certain stimuli using electromyograms (EMG).
  • Ocular behaviour: Ocular behaviour offers very valuable information on certain aspects, both physiological (for example, dilation and contraction of the pupil, blinking frequency, etc.) and behavioural (for example, drift and gaze fixation). Depending on whether the experimental process is carried out on a fixed platform (such as a computer) or with the participant freely moving, this information can be extracted using fixed eye-tracking instruments (attached to the visualization system) or mobile ones (wearable systems in the form of glasses).
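As an example of the kind of metric derived from the beat-to-beat intervals discussed above, RMSSD is a standard time-domain heart-rate-variability measure: the root mean square of successive differences between consecutive RR intervals.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between
    beat-to-beat (RR) intervals, given in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

For instance, RR intervals of 800, 810 and 790 ms give successive differences of +10 and -20 ms, hence an RMSSD of √250 ≈ 15.8 ms.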

In short, analysing and measuring our nervous system to study or even understand our reactions, our emotions, to certain stimuli or situations is much more than a technological challenge. Designing and correctly using elements or devices that analyse our body and its signals is a challenge for science, for technology and for its possible multisectoral applications where the basis is to know the individual and their possible behaviours.

European Union (EU) challenges on artificial intelligence (AI) impact assessment


European Union (EU) challenges on artificial intelligence (AI) impact assessment

Nikolaos Ioannidis[1]

Vrije Universiteit Brussel (VUB)


The European Commission’s (EC) Proposal for a Regulation laying down harmonized rules on artificial intelligence (AI Act) has drawn extensive attention as the ‘first ever legal framework on AI’, resuming last year’s discussions on the AI White Paper. The aim of the Proposal and the mechanisms it encompasses is the development of an ecosystem of trust through the establishment of a human-centric legal framework for trustworthy AI.

The ecosystem aims to establish a framework for a legally and ethically trustworthy AI, promoting socially valuable AI development, ensuring and respecting fundamental rights, the rule of law, and democracy, allocating and distributing responsibility for wrongs and harms and ensuring meaningful transparency and accountability. Any AI application, including the HR-Recycler project and its associated technology, is expected to be subject to specific legal requirements set by this framework.

For individuals to trust that AI-based products are developed and used in a safe and compliant manner and for businesses to embrace and invest in such technologies, a series of novelties have been introduced in the proposed Act. Those novelties include, but are not limited to, i) the ranking of AI systems depending on the level of risk stemming from them (unacceptable, high, limited, and minimal), as well as ii) the legal requirements for high-risk AI systems.

Risk-based approach

AI applications are categorized based on the estimated risk that they may generate. Accordingly, there are various levels of risk – unacceptable, high, limited, minimal. AI applications of unacceptable risk – and thus prohibited – comprise those which, for instance, deploy subliminal techniques beyond a person’s consciousness in order to materially distort a person’s behaviour, causing physical or psychological harm or systems whose function is the evaluation or classification of the trustworthiness of natural persons based on social behaviour or known or predicted personal or personality characteristics, leading to detrimental or unfavourable treatment, inter alia.

High-risk AI applications are the most problematic because, although permitted, they may only be deployed under specific circumstances. Into this category fall applications which pertain to biometric identification and categorization of natural persons, management and operation of critical infrastructure, education and vocational training, employment, workers management and access to self-employment, access to and enjoyment of essential private services and public services and benefits, law enforcement, migration, asylum and border control management, and administration of justice and democratic processes. The research of HR-Recycler could fall under the scope of the safety-critical domain, in which safety components and machinery are used. For these, the major subsequent obligation is the conformity assessment procedure.

Conformity assessment

The conformity assessment procedure for high-risk AI applications is tied with several legal requirements, which can be summarized as follows: i) introduction of a risk management system (a step forward compared to the data protection impact assessment process (DPIA)), ii) setting up a data governance framework (training, validation and testing data sets), iii) keeping technical documentation (ex-ante and continuous), iv) ensuring record keeping (automatic recording of events – ‘logs’), v) enabling transparency and provision of information (interpretation of the system’s output), vi) ensuring human oversight (effectively overseen by natural persons) and vii) guaranteeing accuracy, robustness and cybersecurity (consistent performance throughout the AI lifecycle).

Challenges and questionable areas

However, it is still not clear to what extent and how to carry out the ex-ante conformity assessment, which brings novel challenges to the fore; the proposal itself generates multiple points of debate as to its applicability and the interpretation of its provisions, some of which are listed below:

  • Definition of AI (being incredibly broad, comprising virtually all computational techniques)
  • Complex accountability framework (introduction of new stakeholders and roles, cf. GDPR)
  • Legal requirements (clarification on data governance, transparency and human oversight)
  • Risk categorization (fuzzy and simplistic, lack of guidance on necessity and proportionality)
  • Types of harms protected (excluded financial, economic, cultural, societal harms etc.)
  • Manipulative or subliminal AI (being an evolutive notion)
  • Biometric categorization and emotion recognition systems (disputed and debatable impact)
  • Self-assessment regime (lack of legal certainty and effective enforcement)
  • Outsourcing the discourse on fundamental rights (large discretionary power for private actors)
  • Independence of private actors (in need of strengthened ex-ante controls)
  • Lack of established methodology (a new kind of impact assessment, cf. DPIA)
  • Checklist attitude (binary responses to questionnaires)
  • Technocratic approach towards fundamental rights (against their spirit)
  • Standards-setting (role of incumbent organizations such as CEN / CENELEC)
  • Relation with other laws (interplay with the GDPR and LED?)
  • Multidisciplinarity (miscommunication among internal stakeholders)
  • External stakeholder participation (insufficient engagement of them)
  • Societal acceptance of AI applications (scepticism, mistrust, disbelief)

Given the fact that the AI Act proposal is expected to pass a long negotiation phase, in which the European Commission, the European Parliament and the Council will articulate their opinion and reservations, along with numerous feedback submissions by research and policy actors, the final text may differ from the current formulation, hopefully addressing and clarifying the majority of legal gaps identified until this point.

[1] E-mail: