“Workmates from Indumetal, Gaiker and Tecnalia interacted with a Mixed Reality environment, working with a collaborative robot in the WEEE recycling domain”
The romantic idea that emotions are born in the heart has long been a curious way of expressing that thought and emotion exist separately, that is, that the brain and the heart are unconnected organs.
Today, thanks to Neuroscience (the scientific discipline devoted to the comprehensive study of the nervous system, its functions, its structure and other aspects that help explain behaviour and cognitive processes), ideas such as “emotions do not belong exclusively to the heart, nor does reason reside solely in the brain” take on special relevance. What Neuroscience shows us is that cognition and emotion are closely linked: they are two sides of the same coin, and both reside in our nervous system.
A group of researchers at Tecnalia has created a laboratory dedicated to studying User Experience (UX) and Human Factors (HF), in order to take emotions into account as an innovative aspect not considered until now. With this, they intend to bring objectivity to the analysis of interactions and of otherwise abstract emotional and cognitive processes.
Tecnalia’s Human Factors & User Experience laboratory integrates different devices (dry and wet EEG systems, sensors that measure galvanic skin response and heart rate, eye-tracking glasses, an indoor GPS system, etc.) that allow the accurate, non-invasive measurement of a person’s psycho-physiological signals when exposed to diverse external stimuli. The UX & HF laboratory thus goes one step further than the usual user experience study (surveys, focus groups, interviews, thinking aloud, etc.). Instead of asking users or observing them while they use a product or service to find out what the experience was like, the laboratory technology measures what the nervous system tells us when different stimuli are presented. This yields objective data and avoids the biases derived from the interviewer’s subjective observation, the respondent’s subjective assessment, and even their own emotions.
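As an illustration of how such multi-device setups are commonly synchronised (the article does not describe Tecnalia’s actual tooling), signals and experimental events are often put on a shared clock by streaming timestamped markers. The sketch below uses the open Lab Streaming Layer library (pylsl) to publish a marker stream that tags stimulus onsets, so separately recorded EEG, skin-conductance and heart-rate streams can later be aligned. The stream name and marker labels are hypothetical.

```python
# Minimal sketch (hypothetical setup, not Tecnalia's code): publish stimulus
# markers over Lab Streaming Layer so separately recorded physiological
# streams (EEG, GSR, heart rate) can be aligned on a common clock.
import time
from pylsl import StreamInfo, StreamOutlet

# One irregular-rate string channel for event markers.
info = StreamInfo(name="ux_markers", type="Markers",
                  channel_count=1, nominal_srate=0,
                  channel_format="string", source_id="uxhf_lab_demo")
outlet = StreamOutlet(info)

def mark(label: str) -> None:
    """Timestamp an experimental event on the shared LSL clock."""
    outlet.push_sample([label])

# Hypothetical stimulus sequence for one participant.
for stimulus in ["baseline", "robot_normal", "robot_fast", "robot_malfunction"]:
    mark(f"{stimulus}_start")
    time.sleep(2.0)          # stimulus presentation window
    mark(f"{stimulus}_end")
```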
Specifically, Tecnalia researchers have reproduced in a Virtual (Mixed) Reality environment the process of disassembling a PC tower, in collaboration with a robot that helps with certain tasks, in such a way that participants in the experiment could actually “feel” that they were physically touching the PC components during disassembly. About 50 people from the consortium companies (Indumetal, Gaiker and Tecnalia) took part in the experiment over two weeks. All of them were monitored and asked to repeat the process with slight variations in the behaviour of the collaborative robot (for example, robot malfunctions and changes in robot speed).
This Mixed Reality experiment replicated the layout of a human-robot collaborative workstation in a WEEE (Waste Electrical and Electronic Equipment) recycling plant. Participants were asked to perform the same activities as the workers in the real plant, and the virtual robots interacted with them under the same conditions as in reality. During the experiments, however, the robots sometimes behaved differently, and the participant’s psychophysiological response was recorded.
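The article does not detail how these behaviour variations were administered, but a plausible way to organise such a protocol is a small condition table shuffled per participant, so each repetition of the disassembly task runs the virtual robot with a different, logged behaviour. Everything in the sketch below (condition names, parameter values) is illustrative.

```python
# Illustrative sketch: randomised per-participant schedule of robot
# behaviour conditions for repeated disassembly trials (hypothetical values).
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class RobotCondition:
    name: str
    speed_factor: float    # 1.0 = nominal robot speed
    malfunction: bool      # robot stops/errs mid-task if True

CONDITIONS = [
    RobotCondition("baseline", 1.0, False),
    RobotCondition("fast", 1.5, False),
    RobotCondition("slow", 0.6, False),
    RobotCondition("malfunction", 1.0, True),
]

def schedule_for(participant_id: int) -> list[RobotCondition]:
    """Deterministic shuffle per participant so the order is reproducible."""
    rng = random.Random(participant_id)
    order = CONDITIONS.copy()
    rng.shuffle(order)
    return order

for trial, cond in enumerate(schedule_for(participant_id=17), start=1):
    print(f"trial {trial}: {cond.name} "
          f"(speed x{cond.speed_factor}, malfunction={cond.malfunction})")
```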
To make the experience feel as real as possible, a realistic 3D virtual scenario of a factory was created for viewing through immersive virtual reality glasses, and this virtual scenario was mixed with real tracked elements with which the participants interacted directly while disassembling electronic components.

The real objects that were tracked (with small marker “bubbles”) and aligned with the virtual scene in real time are the worktable, the PC that is manipulated (specifically its top cover) and a fan inside the PC.
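As a sketch of what “aligned in real time” can mean in practice (hypothetical, not the project’s actual code): each tracked rigid body reports a position and orientation every frame, and the corresponding virtual mesh is set to that pose, so what the participant touches matches what they see.

```python
# Minimal sketch (hypothetical): apply a tracked rigid-body pose
# (position + unit quaternion) to its virtual counterpart each frame,
# so the virtual worktable/PC cover/fan stays aligned with the real one.
import numpy as np

def pose_matrix(position, quaternion):
    """4x4 transform from position (x, y, z) and unit quaternion (w, x, y, z)."""
    w, x, y, z = quaternion
    rot = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    m = np.eye(4)
    m[:3, :3] = rot
    m[:3, 3] = position
    return m

# Per-frame update: the tracker reports the PC top cover's pose,
# and the virtual mesh is moved to match (values are made up).
cover_pose = pose_matrix(position=(0.42, 0.95, -0.10),
                         quaternion=(0.995, 0.0, 0.0998, 0.0))
print(cover_pose.round(3))
```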


Once we had achieved this mixed experience (VR scenario plus real tracked objects), the next challenge was to include interaction by tracking the participants’ hands. For this purpose, different technological approaches were tested and compared.
After several tests, the experiment with the participants was finally carried out using vision-based hand and finger tracking, supported by a wrist tracker.
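One plausible reading of this combination (purely illustrative; the article does not describe the fusion logic) is a simple fallback scheme: use the vision-based hand pose while its confidence is high, and fall back to a pose anchored to the wrist tracker when the camera loses the hand.

```python
# Illustrative sketch: fuse vision-based hand tracking with a wrist tracker.
# When the camera's confidence drops (occlusion, hand out of view), fall back
# to an estimate anchored at the tracked wrist. All values are hypothetical.
import numpy as np

CONF_THRESHOLD = 0.5    # below this, the vision estimate is deemed unreliable
PALM_OFFSET = np.array([0.0, 0.0, 0.09])  # rough wrist-to-palm offset (metres)

def fused_hand_position(vision_pos, vision_conf, wrist_pos):
    """Return the hand position from vision if confident, else wrist + offset."""
    if vision_conf >= CONF_THRESHOLD:
        return np.asarray(vision_pos)
    return np.asarray(wrist_pos) + PALM_OFFSET

# Example frames: first the camera sees the hand, then loses it.
print(fused_hand_position((0.40, 1.00, -0.12), 0.92, (0.38, 0.98, -0.20)))
print(fused_hand_position((0.00, 0.00, 0.00), 0.10, (0.38, 0.98, -0.20)))
```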
With this experiment, the worlds of Virtual/Mixed Reality and Neuroscience have come together so that, in the future, our “intelligent” robot workmates will be able to adapt to our emotional and cognitive state, achieving a more fluid interaction between humans and their fellow machines.
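As a closing illustration of what such adaptation might look like (entirely speculative, not described in the article), a robot controller could, for instance, scale its speed down whenever a stress index derived from the physiological signals rises:

```python
# Speculative sketch: scale robot speed by an estimated stress index
# (e.g. derived from skin conductance or heart-rate variability).
# Nothing here reflects the project's actual control logic.
def adapted_speed(nominal_speed: float, stress_index: float) -> float:
    """Linearly slow the robot as stress rises; stress_index in [0, 1]."""
    stress_index = min(max(stress_index, 0.0), 1.0)
    return nominal_speed * (1.0 - 0.5 * stress_index)  # up to 50% slowdown

for stress in (0.0, 0.4, 0.9):
    print(f"stress={stress:.1f} -> speed={adapted_speed(0.8, stress):.2f} m/s")
```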
