Research Project
CHLOE-Map: Geometric Coordinates to Navigate the Configuration Space of Cloth-Like Objects for Robotic Manipulation
Type
National Project
Start Date
01/09/2024
End Date
31/08/2027
Project Code
PID2023-152259OB-I00
Staff
- Alenyà, Guillem (Principal Investigator)
- Torras, Carme (Researcher)
- Foix, Sergi (Researcher)
- Jiménez, Pablo (Researcher)
- Barrue, Cristian (Researcher)
- Civit, Aniol (PhD Student)
- Salido, Pablo (Support)
Project Description
Project PID2023-152259OB-I00 funded by MCIN/AEI/10.13039/501100011033 and by ERDF, EU
Despite their ubiquitous presence in domestic and industrial environments, textile objects remain a challenge for robots: their configuration space (C-space) is infinite-dimensional, models of it are inaccurate, and perception is hampered by self-occlusions.
The most popular works in the literature are data-driven: they obtain good results for simple tasks but generalize poorly and require large amounts of data, which must be manually annotated when learned from real observations or suffers from large sim-to-real gaps when learned in simulation. Model-based approaches, in turn, are high-dimensional and carry a computational cost that is prohibitive for real applications. Combining data-driven and model-based approaches remains an open challenge, and CHLOE-Map targets it at its core by developing an analytical encoding of cloth that captures deformation complexity and can be combined with data-driven methods to make them less data-hungry and more generalizable.
The main objective of the project is to develop an integral framework for dynamic manipulation of cloth-like objects, with a self-explainable system that facilitates interaction. To achieve this goal, the key innovative idea is a set of coordinates that navigate the complex configuration space of cloth, built from topology and geometric indices defined on parts of the object rather than on the whole mesh. The project builds on a seminal result [Coltraro2023] that presents a preliminary version of such coordinates, allowing deformation states to be classified based solely on distances; further development is needed to establish how these coordinates induce a model representation that greatly simplifies planning and perception for state estimation. In contrast to what data-driven approaches allow, we will exploit this analytical representation to reason about the safety and robustness of the synthesized manipulations.
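As a rough illustration of what distance-based coordinates can do, the minimal Python sketch below classifies a few coarse deformation states of a square cloth from its tracked corner positions alone. It is not the coordinate set of [Coltraro2023] nor part of the project's implementation: the landmark choice (four corners), the thresholds and all names are hypothetical, chosen only to show how distances between parts of the object, rather than the whole mesh, can separate deformation states.

```python
import numpy as np

# Hypothetical setup: a square cloth whose four corners are tracked.
# REST_SIDE is the side length of the cloth in its flat rest state (assumed known).
REST_SIDE = 0.5  # metres, assumed value


def corner_distances(corners: np.ndarray) -> np.ndarray:
    """Pairwise Euclidean distances between the four corner landmarks (shape (4, 3))."""
    diffs = corners[:, None, :] - corners[None, :, :]
    return np.linalg.norm(diffs, axis=-1)


def classify_state(corners: np.ndarray) -> str:
    """Coarse, distance-only classification of the cloth deformation state.

    Illustrative stand-in for distance-based C-space coordinates: it only
    measures how much the corner-to-corner diagonals have shrunk relative
    to the flat rest configuration. Corners are assumed ordered around the
    square, so (0, 2) and (1, 3) are the diagonals.
    """
    d = corner_distances(corners)
    diag_rest = REST_SIDE * np.sqrt(2.0)
    diag_ratio = max(d[0, 2], d[1, 3]) / diag_rest

    # Hypothetical thresholds for three coarse states.
    if diag_ratio > 0.9:
        return "flat"
    if diag_ratio > 0.45:
        return "folded"
    return "crumpled"


if __name__ == "__main__":
    flat = np.array([[0.0, 0.0, 0.0],
                     [REST_SIDE, 0.0, 0.0],
                     [REST_SIDE, REST_SIDE, 0.0],
                     [0.0, REST_SIDE, 0.0]])
    print(classify_state(flat))  # -> "flat"
```

In an actual pipeline such distance features would be computed over many object parts and combined with topological information, but even this toy version shows why a distance-based encoding is cheap to evaluate and independent of the full mesh state.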
CHLOE-Map will further extend this line of research by combining it with data-driven approaches to incorporate scene-state information, such as environmental constraints, grasping state and dynamics, thereby bridging the gap between the high-level and low-level aspects of cloth manipulation. We will also integrate the C-space representation with additional information about the state of each manipulation step into a knowledge representation that enables robots to store memory that can be retrieved in a meaningful way, allowing domain experts to understand and eventually debug the decision system.
The project is organised around four objectives:
- O1: development of the C-space representation that allows planning paths that navigate different deformation states.
- O2: implementation of perception methods able to identify the developed representation in real time.
- O3: extraction of the physical requirements for executing the planned state transitions with robots, including the development of a grasping framework for textiles.
- O4: development of the knowledge representation to extract the relevant information from the system.
We expect CHLOE-Map to have a significant impact on the cloth-manipulation literature by proposing a fresh perspective that places the C-space representation at the center, as the pivotal point from which planning, grasping and perception methods are developed and integrated with data-driven methods.