IBEC presents a gesture-based communication protocol to enhance human-robot collaboration


SPECS-lab

28/11/2020

Enhancing human-robot collaboration in industrial environments via a gesture-based communication protocol.

One of the core principles behind the HR-Recycler project is to develop new ways of enhancing human-robot collaboration in industrial environments. A key requirement for establishing an efficient collaboration between humans and robots is the development of a concrete communication protocol. In particular, human workers require a fast way to communicate with the robot, as well as a mechanism to control it when necessary.

An obvious choice would be to issue voice commands. However, the industrial recycling plant in which the human-robot interaction will take place is noisy, and any communication protocol that relies on sound will face many problems. A protocol based on facial recognition is also ruled out, since workers in these contexts need to wear protective masks. Using a set of buttons or a tablet to send information to the robot can be a good solution; however, these mechanisms cannot be the only channel of communication, since workers need a fast and intuitive channel they can resort to even while at the workbench or handling a tool.

SADAKO has built a replica of a workbench at their premises, on which the gesture recognition was tested.

To achieve this, we have developed a non-verbal communication protocol based on gestures that will serve as an input for the robots. We have identified the following messages to be conveyed through gestures: start (or resume a previously paused action), pause (the current action), cancel (the current action), sleep (the robot goes to idle mode), point (a directional point to focus attention), wave (hello), no, and yes.
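As a rough sketch, this repertoire could be encoded as a simple enumeration mapped to robot commands. The Python names below are illustrative placeholders, not the project's actual identifiers:

```python
from enum import Enum

class Gesture(Enum):
    """Gesture messages defined in the communication protocol."""
    START = "start"    # start, or resume a previously paused action
    PAUSE = "pause"    # pause the current action
    CANCEL = "cancel"  # cancel the current action
    SLEEP = "sleep"    # robot goes to idle mode
    POINT = "point"    # directional point to focus attention
    WAVE = "wave"      # hello
    NO = "no"
    YES = "yes"

# Hypothetical mapping from action-related gestures to robot commands.
GESTURE_TO_COMMAND = {
    Gesture.START: "resume_task",
    Gesture.PAUSE: "pause_task",
    Gesture.CANCEL: "cancel_task",
    Gesture.SLEEP: "go_idle",
}
```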

Ismael performing the Vulcan salutation gesture that means “Live long and prosper”. Probably the most popular way to say “hi” between two rational agents (or at least the coolest, according to science).

To choose which gestures represent each message, we need to balance two requirements. Gestures should be easy to remember, so that workers do not need training sessions to learn them; we therefore avoid suggesting highly complex gestures. At the same time, they should not be too simple or habitually used in everyday human communication, to avoid situations where a worker performs one spontaneously, for example while interacting with a co-worker.

Upon detection of the “hammering” action, the robot signals a blue light. This test served to illustrate that the gesture was correctly identified and received by the robot.

For the technical implementation of such a communication system, two partners of the HR-Recycler consortium have joined efforts. On one side, SADAKO has developed a gesture detection algorithm to correctly identify, in real time, each of the communication gestures defined in the protocol. On the other side, IBEC has developed the concrete non-verbal communication repertoire, as well as an interaction-control module that integrates the information coming from SADAKO’s gesture recognition software and transforms it into a specific action command that is issued to the robot.
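A minimal sketch of what such an interaction-control module could look like, assuming a ROS setup in which the gesture recognizer publishes labels as plain strings (all topic and command names here are hypothetical):

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# Hypothetical mapping from recognized gesture labels to robot commands.
GESTURE_TO_COMMAND = {
    "start": "resume_task",
    "pause": "pause_task",
    "cancel": "cancel_task",
    "sleep": "go_idle",
}

def on_gesture(msg):
    # Translate the recognized gesture into an action command, if any.
    command = GESTURE_TO_COMMAND.get(msg.data)
    if command is not None:
        cmd_pub.publish(String(data=command))

rospy.init_node("interaction_manager")
cmd_pub = rospy.Publisher("/robot/command", String, queue_size=1)
rospy.Subscriber("/gesture_recognition/gesture", String, on_gesture)
rospy.spin()
```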

Óscar performs a “wave” gesture that is correctly identified by the gesture recognition software developed by SADAKO.

The first physical integration session between two partners of the HR-Recycler consortium took place last month at SADAKO’s premises, to perform the initial tests of the gesture-based HRI communication protocol. There, a team composed of the IBEC and SADAKO groups tested the real-time detection of several of the proposed gestures. They also verified that the interaction manager was receiving the information about the identified gestures and correctly converting it into the corresponding robot commands.

Robotnik explains how to use MoveIt to develop a robotic manipulation application


HOW TO USE MOVEIT TO DEVELOP A ROBOTIC MANIPULATION APPLICATION

The European Commission-funded HR-Recycler project aims at developing a hybrid human-robot collaborative environment for the disassembly of electronic waste. Humans and robots will work collaboratively, sharing different manipulation tasks. One of these tasks takes place in the disassembly area, where electronic components are sorted by type into their corresponding boxes. To speed up the component-sorting task, Robotnik is developing a mobile robotic manipulator that needs to pick boxes filled with disassembled components from the workbenches and transport them either to their final destination or to further processing areas. MoveIt is an open-source robotic manipulation platform that allows you to develop complex manipulation applications using ROS. Here we present a brief summary of how we used MoveIt's functionality to develop a pick-and-place application.

Figure 1: Pick and Place task visual description.

We found MoveIt to be very useful in the early stages of developing a robotic manipulation application. It allowed us to decide on the environment setup, to check whether our robot is capable of performing the manipulation actions we need in that setup, to determine how to arrange the setup for the best performance, and to design the components of the workspace the robot has to interact with so that the required manipulation actions can be completed correctly. Workspace layout is very easy to carry out in MoveIt, as it lets you build the environment from mesh objects previously designed in any CAD program and lets your robot interact with them.
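As an illustration, a scene like this can be built with the Python moveit_commander interface roughly as follows; the frame, object name, and mesh path are placeholders of ours:

```python
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("scene_setup")
scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the scene interface time to connect

workbench_pose = PoseStamped()
workbench_pose.header.frame_id = "base_link"
workbench_pose.pose.position.x = 0.8
workbench_pose.pose.orientation.w = 1.0

# The CAD mesh becomes a collision object that planners must avoid.
scene.add_mesh("workbench", workbench_pose, "/path/to/workbench.stl")
```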

One of MoveIt’s strengths is that it allows you to plan towards any goal position while not only taking the environment into account by avoiding objects, but also interacting with it by grabbing objects and including them in the planning process. Any collision object in the MoveIt scene can be attached to a desired robot link; MoveIt will then allow collisions between that link and the object, and once attached, the object will move together with the robot link.

Figure 2: MoveIt Planning Scene with collision objects (green) and attached objects (purple).
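Continuing the sketch above (where scene is the PlanningSceneInterface), attaching and detaching an already-added object follows the standard MoveIt pattern; the link and object names are again illustrative:

```python
# Attach the box to the gripper link: collisions between the box and
# the listed touch links are now allowed, and the box moves with the arm.
touch_links = ["gripper_left_finger", "gripper_right_finger"]
scene.attach_box("gripper_link", "component_box", touch_links=touch_links)

# After placing the box back on its holder, detach it from the robot.
scene.remove_attached_object("gripper_link", name="component_box")
```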

This functionality helped us determine from the very beginning whether our robot arm was able to reach objects on a table of a certain height, how far from the table the robot should position itself to reach the objects properly, and whether there is enough space to perform the arm movements required to manipulate objects around the workspace. It also helped us design the boxes needed for the task, allowing us to settle on a box size that lets the robot arm perform the necessary manipulation movements within the restricted working area.

However, MoveIt’s main use is motion planning. MoveIt includes various tools that allow you to plan motion to a desired pose with great flexibility: you can adjust the motion-planning algorithm to your application to obtain the best performance. This is very useful because it allows you to restrict your robot’s allowed motion to fit very specific criteria, which is important in an application like ours, with a restricted working space where the robot needs to manipulate objects precisely in an environment shared with humans.

Figure 3: Planning to a desired goal taking into account collisions with scene.
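A minimal pose-goal sketch using moveit_commander, following the standard MoveIt tutorial pattern; the group name and goal values are placeholders, not our actual configuration:

```python
import moveit_commander
from geometry_msgs.msg import Pose

group = moveit_commander.MoveGroupCommander("arm")

goal = Pose()
goal.position.x = 0.6
goal.position.z = 0.4
goal.orientation.w = 1.0  # identity orientation

group.set_pose_target(goal)
group.go(wait=True)         # plan towards the goal and execute
group.stop()                # guard against any residual motion
group.clear_pose_targets()
```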

One of our biggest motion requirements is that the robot arm must keep the boxes parallel to the ground while manipulating them, as they will be filled with objects that need to be carried between working stations. This can easily be taken into account with MoveIt, as it allows you to plan using constraints. Various constraints can be applied; the ones we found most useful for our application are joint constraints and orientation constraints. Orientation constraints restrict the desired orientation of a robot link; they are very useful for keeping the robot’s end effector parallel to the ground, which is needed to manipulate the boxes properly. Joint constraints limit the position of a joint to within a certain bound; they are very useful for shaping the way you want your robot to move. In our application they allowed us to move the arm while maintaining a relative position between the elbow and the shoulder, producing more natural movements and avoiding potentially dangerous motions.

Figure 4: Motion Planning with joint and orientation constraints vs without.
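As a sketch of how an orientation path constraint keeps the end effector level, reusing group and goal from the previous snippet (the tolerance values are illustrative, not our tuned parameters):

```python
from moveit_msgs.msg import Constraints, OrientationConstraint

oc = OrientationConstraint()
oc.header.frame_id = "base_link"
oc.link_name = group.get_end_effector_link()
oc.orientation.w = 1.0               # desired level orientation
oc.absolute_x_axis_tolerance = 0.1   # allowed roll deviation (rad)
oc.absolute_y_axis_tolerance = 0.1   # allowed pitch deviation (rad)
oc.absolute_z_axis_tolerance = 3.14  # yaw is left free
oc.weight = 1.0

constraints = Constraints()
constraints.orientation_constraints.append(oc)

group.set_path_constraints(constraints)
group.set_pose_target(goal)
group.go(wait=True)
group.clear_path_constraints()  # constraints persist until cleared
```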

Another useful MoveIt motion-planning tool is the ability to plan movements to a goal position in both Cartesian and joint space, letting you switch between the two for different desired trajectory outcomes. Cartesian-space planning is used whenever you want the end-effector link to follow a very precise motion. In our application we made use of this functionality when moving down from the box approach position to the grab position and back again. Our robot has to carry the boxes with it, and due to the limited space on its base all of the boxes are quite close together; using Cartesian planning we could ensure verticality while raising a box from its holder, avoiding the boxes catching on one another and unnecessary stops. Joint-space planning, however, is useful for obtaining more natural trajectories when the arm is moving between different grabbing positions, making the movement smoother.

Figure 5: Motion Planning in Cartesian Space vs Joint Space.
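A sketch of a straight vertical lift computed as a Cartesian path, again reusing group; the step resolution and lift height are illustrative:

```python
import copy

waypoints = []
lift = copy.deepcopy(group.get_current_pose().pose)
lift.position.z += 0.15  # raise the box 15 cm straight up
waypoints.append(lift)

# eef_step: interpolation resolution in metres along the path;
# jump_threshold 0.0 disables joint-space jump checking.
(plan, fraction) = group.compute_cartesian_path(waypoints, 0.01, 0.0)
if fraction == 1.0:  # the full path was achievable
    group.execute(plan, wait=True)
```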

This is just a brief summary of how we used MoveIt to develop a preliminary robotic pick-and-place manipulation application; there are still lots of other tools that MoveIt has to offer. Some of MoveIt’s most advanced applications include integrating 3D sensors to build a perception layer used for object recognition in pick-and-place tasks, or using deep-learning algorithms for grasp-pose generation, areas that will be explored in the next steps. Stay tuned for future updates on the development of a robotic manipulation application using MoveIt’s latest implementations.

Below you will find a short demonstration of the currently developed application running on Robotnik’s RB-KAIROS mobile manipulator.

Component Sorting manipulation application DEMO

https://www.youtube.com/watch?v=JgyDB57xjDw