On inferring intentions in shared tasks for industrial collaborative robots

Alberto Olivares-Alarcos, Sergi Foix and Guillem AlenyĆ 

Institut de Robòtica i Informàtica Industrial, CSIC-UPC, C/ Llorens i Artigas 4-6, 08028 Barcelona, Spain.

Abstract - Inferring human operators' actions in shared collaborative tasks plays a crucial role in enhancing the cognitive capabilities of industrial robots. In these incipient collaborative robotic applications, humans and robots should share not only space but also forces and the execution of a task. In this article, we present a robotic system that is able to identify different human intentions and to adapt its behavior accordingly, by means of force data alone. To accomplish this aim, three major contributions are presented: (a) force-based operator's intent recognition, (b) a force-based dataset of physical human-robot interaction and (c) validation of the whole system in a scenario inspired by a realistic industrial application. This work is an important step towards a more natural and user-friendly manner of physical human-robot interaction in scenarios where humans and robots collaborate in the accomplishment of a task.

Video

Demo of the experiments conducted with fifteen users who individually collaborated with a robot on the accomplishment of a shared task: polishing an object.

Dataset

Force-based dataset of physical human-robot interaction. Due to the lack of similar existing datasets, we present a novel dataset containing force-based information extracted from natural human-robot interaction. Geared towards the inference of operators' intentions, the dataset comprises labelled signals from a force sensor. Our aim is to generalise from a few users to many; therefore, the dataset was recorded with only two users. Indeed, this is compliant with industrial environments, in which the system should be usable by new operators, preferably without retraining. The dataset and the code used for the evaluation shown in the article are available on GitHub.

The data was recorded using an ATI Multi-Axis Force/Torque Sensor Mini40 SI-20-1, which was fastened to the wrist of the robot, at the base of the end effector. The sensor's default configuration was used, and measurements were taken at a frequency of 500 Hz. Every sample contains a single sort of interaction, from the beginning to the end of the physical contact. It is worth mentioning that the gathered samples were not all of the same length, ranging from half a second to three seconds. In the dataset, the shorter ones are padded with trailing zero values so that all samples have the same length.

The dataset includes six textual files for each of the human's actions, corresponding to the six axes of the force/torque sensor. Each row of a file contains the sensor's measurements for one sample. We have released two different datasets: mechanical and natural. In the mechanical dataset, each class follows distinct movement patterns, which produce completely different force signals; the samples of the different intentions/classes are therefore clearly distinguishable from each other. It contains 600 samples: with three classes and two users, each user performed 100 samples of each class.
In contrast, in the natural dataset, the movement patterns of the different classes are much more similar to each other. That is, there is more ambiguity among samples of different classes, which makes classification harder. This dataset contains more samples, 900: with three classes and two users, each user performed 150 samples of each class.
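The file layout described above (one text file per force/torque axis, one sample per row, shorter samples zero-padded to a common length) can be sketched with a minimal loader. This is only an illustration: the function names and the assumption of whitespace-separated floats are ours, and the exact on-disk format is defined by the release on GitHub.

```python
# Hypothetical loader for the dataset layout described above.
# Assumed format: each axis file holds one interaction sample per line,
# as whitespace-separated floating-point values recorded at 500 Hz.

def load_axis_file(path):
    """Read one axis file; return a list of samples (lists of floats)."""
    samples = []
    with open(path) as f:
        for line in f:
            values = [float(v) for v in line.split()]
            if values:  # skip blank lines
                samples.append(values)
    return samples

def zero_pad(samples):
    """Pad shorter samples with trailing zeros so all share the max length."""
    max_len = max(len(s) for s in samples)
    return [s + [0.0] * (max_len - len(s)) for s in samples]
```

For example, `zero_pad([[1.0, 2.0], [3.0]])` extends the second sample with a trailing zero so both have length two, mirroring how the released dataset equalises sample durations.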

Citation

Olivares-Alarcos, A.; Foix, S.; AlenyĆ , G. On Inferring Intentions in Shared Tasks for Industrial Collaborative Robots. Electronics 2019, 8, 1306.