TFG

Development of a learning from demonstration environment using ZED 2i and HTC Vive Pro

Information

  • Started: 01/03/2022
  • Finished: 13/07/2022

Description

Teaching complex capabilities, such as folding garments, to a bi-manual robot is a very challenging task, which is often tackled using learning from demonstration datasets. The few garment folding datasets currently available to the robotics research community are either gathered from human demonstrations or generated through simulation. The former suffer from the difficulty of perceiving the human action and transferring it to the dynamic control of the robot, while the latter require coding human motion into the simulator in open loop, resulting in far-from-realistic movements.

In this thesis, a novel virtual reality (VR) framework is proposed, based on the Unity 3D platform and the combined use of the HTC Vive Pro system, the ZED mini and ZED 2i cameras, and the Leap Motion hand-tracking module.

The framework is capable of detecting and tracking objects, animals, and human bodies in a 3D environment. It can also simulate highly realistic garments while allowing users to interact with them in real time, either through handheld controllers or with their own hands. By doing so, and thanks to the immersive experience, the framework closes the gap between the human and the robot perception-action loops, while simplifying data capture and yielding more realistic samples.
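
As an illustration of the kind of 3D detection and tracking the ZED cameras contribute to such a framework, below is a minimal sketch using the ZED SDK Python bindings (pyzed) and its object detection module. It only sketches the cameras' capability, not the thesis' Unity implementation, and class or parameter names may differ across SDK versions.

    import pyzed.sl as sl

    # Open a ZED camera (ZED mini or ZED 2i) with metric units.
    zed = sl.Camera()
    init_params = sl.InitParameters()
    init_params.coordinate_units = sl.UNIT.METER
    if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("Could not open the ZED camera")

    # Positional tracking is needed so detections can be tracked in 3D over time.
    zed.enable_positional_tracking(sl.PositionalTrackingParameters())

    # Enable the object detection module (persons, vehicles, animals, ...).
    det_params = sl.ObjectDetectionParameters()
    det_params.enable_tracking = True
    zed.enable_object_detection(det_params)

    objects = sl.Objects()
    runtime_params = sl.ObjectDetectionRuntimeParameters()

    # Grab a few frames and print the 3D position of every tracked object.
    for _ in range(100):
        if zed.grab() == sl.ERROR_CODE.SUCCESS:
            zed.retrieve_objects(objects, runtime_params)
            for obj in objects.object_list:
                print(obj.id, obj.label, obj.position)

    zed.close()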

Finally, using the developed framework, a novel garment manipulation dataset will be recorded, containing data and videos of nineteen different types of manipulation, intended to support robot learning from demonstration tasks.
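
Purely as a hypothetical illustration of how one recorded demonstration frame could be serialized, the sketch below uses only the Python standard library; the record_step helper, the field names, and the file name are illustrative assumptions, not the dataset's actual schema.

    import json
    import time

    def record_step(log_file, frame_id, left_hand_pose, right_hand_pose, grasped_vertices):
        """Append one time-stamped frame of a manipulation demonstration.

        All field names are illustrative; a real sample would also reference
        the synchronized video captured by the framework.
        """
        sample = {
            "timestamp": time.time(),
            "frame": frame_id,
            "left_hand_pose": left_hand_pose,      # e.g. [x, y, z, qx, qy, qz, qw]
            "right_hand_pose": right_hand_pose,
            "grasped_vertices": grasped_vertices,  # garment mesh vertices currently held
        }
        log_file.write(json.dumps(sample) + "\n")

    # Example usage: one (made-up) frame of a folding demonstration.
    with open("fold_demo_000.jsonl", "w") as f:
        record_step(f, 0, [0.1, 0.9, 0.3, 0.0, 0.0, 0.0, 1.0],
                    [0.4, 0.9, 0.3, 0.0, 0.0, 0.0, 1.0], [120, 121])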

This work falls within the scope of the following projects:

  • CLOTHILDE: Cloth manipulation learning from demonstration (web)
  • BURG: Benchmarks for Understanding Grasping (web)