Publication

Tracking cloth deformation: a novel dataset for closing the sim-to-real gap for robotic cloth manipulation learning

Journal Article (2025)

Journal

International Journal of Robotics Research

Doc link

https://doi.org/10.1177/02783649251317617

File

Download the PDF of the publication

Abstract

Robotic learning for deformable object manipulation, such as textiles, is often done in simulation because current perception methods struggle to capture cloth deformation. For this reason, the robotics community is continually searching for more realistic simulators that reduce the sim-to-real gap, which remains large, especially when dynamic motions are applied. We present a cloth dataset consisting of 120 high-quality recordings of several textiles during dynamic motions. Using a motion capture system, we record the locations of key points on the cloth surface for four types of fabric (cotton, denim, wool and polyester), in two sizes and at different speeds. The scenarios considered are all dynamic and involve rapid shaking and twisting of the textiles, collisions with frictional objects, strong hits with a long, thin rigid object, and even self-collisions. We describe in detail the scenarios considered, the collected data, and how to read and use it. In addition, we propose a metric that lets the dataset serve as a benchmark for quantifying the sim-to-real gap of any cloth simulator. Finally, we show that the recorded trajectories can be executed directly by a robotic arm, enabling learning from demonstration and other imitation learning techniques.
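To illustrate the kind of benchmark such a dataset enables, the sketch below computes one plausible sim-to-real gap measure: the mean Euclidean error between recorded and simulated key-point positions, averaged over frames. This is a generic illustration under the assumption that both trajectories are time-aligned and list the same key points in the same order; it is not necessarily the metric defined in the paper.

```python
import math

def sim_to_real_gap(real_traj, sim_traj):
    """Mean per-key-point Euclidean error averaged over all frames.

    real_traj, sim_traj: sequences of frames, where each frame is a
    list of (x, y, z) key-point positions. Hypothetical metric for
    illustration only; assumes equal length and matching key points.
    """
    if len(real_traj) != len(sim_traj):
        raise ValueError("trajectories must have the same number of frames")
    total, count = 0.0, 0
    for real_frame, sim_frame in zip(real_traj, sim_traj):
        for real_pt, sim_pt in zip(real_frame, sim_frame):
            # Euclidean distance between recorded and simulated key point
            total += math.dist(real_pt, sim_pt)
            count += 1
    return total / count
```

A simulator with a smaller value reproduces the recorded deformation more faithfully; identical trajectories give a gap of zero.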

Categories

computer vision, object recognition, robot vision.

Author keywords

Cloth manipulation; real datasets; robotic learning; motion capture; cloth simulation; sim-to-real gap

Scientific reference

F. Coltraro, J. Borràs, M. Alberich-Carramiñana and C. Torras. Tracking cloth deformation: a novel dataset for closing the sim-to-real gap for robotic cloth manipulation learning. International Journal of Robotics Research, 2025, to appear.