TFG (Final Degree Project)

User tracking and haptic interaction for robot-assisted dressing

Information

  • Started: 10/02/2017
  • Finished: 19/07/2017

Description

Service robots need to adapt to user needs, and teaching a robot new tasks is usually done through close human-robot interaction. This project explores two interaction modalities: visual user tracking and physical (haptic) interaction. The goal is to develop a framework that integrates these two modalities in order to recognize user intentions. The developed algorithms will be tested in an assistive-dressing scenario, in which the robot assists the user in putting on a jacket or coat.
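As a toy illustration of how the two modalities might be combined, the sketch below fuses a visual cue (arm pose) with a haptic cue (pulling force on the garment) into a discrete intention label. All names, thresholds, and labels here are hypothetical and are not taken from the project itself:

```python
from dataclasses import dataclass


@dataclass
class Observation:
    arm_raised: bool   # from visual user tracking (e.g. skeleton pose)
    pull_force: float  # from force/torque sensing on the garment, in newtons


def infer_intention(obs: Observation, force_threshold: float = 5.0) -> str:
    """Combine visual and haptic cues into an intention label.

    The threshold and labels are illustrative placeholders.
    """
    if obs.pull_force >= force_threshold:
        return "resisting"        # garment snagged or user pulling away
    if obs.arm_raised:
        return "ready_to_dress"   # user cooperating, low resistance
    return "not_ready"            # arm down, no interaction yet


print(infer_intention(Observation(arm_raised=True, pull_force=1.2)))
# → ready_to_dress
```

A real system would replace the hand-set rule with learned models over continuous tracking and force signals, but the structure (two sensing channels feeding one intention estimator) is the same.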

The work is under the scope of the following projects:

  • I-DRESS: Assistive interactive robotic system for support in dressing (web)