User attention and intention recognition in human-robot collaborative tasks
- If you are interested in the proposal, please contact the supervisors.
Some collaborative tasks require close interaction between the robot and the user, and for such tasks the robot’s ability to accurately track and recognize user motions is of great importance. This project will focus on recognizing user attention and intention through gestures as the main interaction modality. The developed algorithms will be tested in a user-guided “pick-and-place” scenario in which a robot must pick up different pieces of clothing from a pile and sort them according to user preferences.
The project will start with an evaluation of existing tools for user tracking and gesture recognition (Kinect skeleton tracking, etc.), together with a study of the relevant literature on gesture-based human-robot interaction.
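To make the attention-recognition goal concrete, the following is a minimal, hypothetical sketch of how skeleton data from a tracker such as the Kinect might feed an attention estimate. The `Joint` struct, the `isFacingCamera` function, and the angle threshold are all illustrative assumptions, not part of any tracker's API; a real system would use the full joint set and confidence values the tracker provides.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical minimal joint representation; real skeleton trackers
// (e.g., Kinect skeleton tracking) expose many more joints plus
// per-joint confidence values.
struct Joint { double x, y, z; };  // metres, camera frame (z = depth)

// Rough attention proxy (illustrative only): the user is assumed to be
// facing the camera when the shoulder line is nearly perpendicular to
// the camera's optical (z) axis, i.e., both shoulders lie at a similar
// depth. The threshold of ~20 degrees is an arbitrary placeholder.
bool isFacingCamera(const Joint& leftShoulder, const Joint& rightShoulder,
                    double maxAngleRad = 0.35) {
  const double dx = rightShoulder.x - leftShoulder.x;
  const double dz = rightShoulder.z - leftShoulder.z;
  // Angle between the shoulder line and the camera's x axis.
  const double angle = std::atan2(std::fabs(dz), std::fabs(dx));
  return angle < maxAngleRad;
}
```

In practice such a geometric cue would be one input among several (head orientation, gaze, motion history), but it illustrates the kind of per-frame feature the tool evaluation phase would assess.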
The expected outcomes of the project are:
- Development of an algorithm for user following associated with the “pick-and-place” task. (User following requires both user motion tracking and robot motion planning.)
- Development of a gesture recognition algorithm and definition of a gesture vocabulary relevant to the “pick-and-place” task.
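As a sketch of what a gesture vocabulary for the sorting task could look like, the toy classifier below maps hand position relative to the shoulder onto three commands. The vocabulary, the `classify` function, and its thresholds are hypothetical placeholders for illustration; the actual vocabulary and recognition method are to be defined during the project.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical joint type; a real skeleton tracker supplies these poses.
struct Joint { double x, y, z; };  // metres, camera frame

// Illustrative three-word vocabulary for the sorting scenario:
// point left / point right to choose a pile, raised hand to stop.
enum class Gesture { PointLeft, PointRight, Stop, None };

// Toy rule-based classifier (thresholds are arbitrary placeholders):
// a hand raised above the shoulder means stop; otherwise a hand
// displaced sideways from the shoulder points toward that side.
Gesture classify(const Joint& hand, const Joint& shoulder,
                 double sideThresh = 0.30, double raiseThresh = 0.15) {
  if (hand.y > shoulder.y + raiseThresh) return Gesture::Stop;
  const double dx = hand.x - shoulder.x;
  if (dx < -sideThresh) return Gesture::PointLeft;
  if (dx >  sideThresh) return Gesture::PointRight;
  return Gesture::None;
}
```

A per-frame rule like this would need temporal filtering (e.g., requiring a gesture to persist across several frames) before being wired to robot commands, which is part of what the recognition algorithm must address.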
The project allows for possible extensions depending on the results obtained.
The candidate will have access to the WAM robotic arms and the Kinect 1 and Kinect 2 cameras of our Perception and Manipulation Lab.
Requirements:
- Great interest in robotics.
- Good programming skills in C/C++.
- Programming experience with ROS.
- Previous experience working with 3D cameras.
The work falls within the scope of the following projects:
- I-DRESS: Assistive interactive robotic system for support in dressing (web)