Publication
Garment manipulation dataset for robot learning by demonstration through a virtual reality framework
Conference Article
Conference
Catalan Conference on Artificial Intelligence (CCIA)
Edition
24th
Pages
199-208
Doc link
http://dx.doi.org/10.3233/FAIA220338
Abstract
Teaching complex capabilities, such as folding garments, to a bi-manual robot is a very challenging task, often tackled through learning-from-demonstration datasets. The few garment folding datasets currently available to the robotics research community are either gathered from human demonstrations or generated through simulation. The former suffer from the difficulty of perceiving human actions and transferring them to the robot's dynamic control, while the latter require encoding human motion into the simulator in open loop, which results in far-from-realistic movements. In this article, we present a small but highly accurate dataset of human cloth folding demonstrations. The dataset is collected through a novel virtual reality (VR) framework we propose, built on Unity's 3D platform and an HTC Vive Pro system. The framework simulates very realistic garments while allowing users to interact with them, in real time, through handheld controllers. Thanks to this immersive experience, our framework bridges the gap between the human and robot perception-action loops, while simplifying data capture and yielding more realistic samples.
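The paper itself is not reproduced on this page, so purely as an illustration of the kind of per-frame record such a VR teleoperation framework might log, here is a minimal Python sketch. All names, fields, and the JSON format are assumptions for illustration, not the authors' actual data schema (in their framework the garment is simulated and rendered in Unity and manipulated with HTC Vive Pro controllers):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ControllerState:
    """Pose of one handheld VR controller (hypothetical layout)."""
    position: tuple       # (x, y, z) in metres, world frame
    orientation: tuple    # unit quaternion (x, y, z, w)
    trigger: float        # grip/trigger value in [0, 1]

@dataclass
class DemoFrame:
    """One time step of a bi-manual cloth folding demonstration."""
    timestamp: float            # seconds since demonstration start
    left: ControllerState       # left-hand controller
    right: ControllerState     # right-hand controller
    cloth_vertices: list        # simulated garment mesh vertices [(x, y, z), ...]

def save_demo(frames, path):
    """Serialize a full demonstration to JSON (illustrative format only)."""
    with open(path, "w") as f:
        json.dump([asdict(fr) for fr in frames], f)

# Example: a single (dummy) frame of a two-handed fold.
frame = DemoFrame(
    timestamp=0.0,
    left=ControllerState((0.3, 1.0, 0.2), (0.0, 0.0, 0.0, 1.0), 1.0),
    right=ControllerState((-0.3, 1.0, 0.2), (0.0, 0.0, 0.0, 1.0), 1.0),
    cloth_vertices=[(0.0, 0.8, 0.0), (0.1, 0.8, 0.0)],
)
save_demo([frame], "demo_0001.json")
```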
Categories
computer vision, edge detection, gesture recognition, image classification, pose estimation.
Author keywords
Learning by demonstration, Virtual reality, Human in the loop, Data acquisition
Scientific reference
A. Boix, S. Foix and C. Torras. Garment manipulation dataset for robot learning by demonstration through a virtual reality framework, 24th Catalan Conference on Artificial Intelligence, 2022, Sitges, in Artificial Intelligence Research and Development, Vol. 356 of Frontiers in Artificial Intelligence and Applications, pp. 199-208, IOS Press.