Research Project
BURG: Benchmarks for Understanding Grasping
Type: European Project
Start Date: 01/11/2019
End Date: 31/07/2023
Project Code: PCI2019-103447

Staff
- Torras, Carme (Researcher)
- Foix, Sergi (Researcher)
- Borràs, Júlia (Researcher)
- Grosch, Patrick John (Researcher)
- Garcia, Irene (PhD Student)
Project Description
Project PCI2019-103447, funded by MCIN/AEI/10.13039/501100011033 and by the European Union.
Grasping rigid objects has been studied extensively under a wide variety of settings. The common measure of success is checking whether the robot can hold an object for a few seconds. This is not enough. To obtain a deeper understanding of object manipulation, we propose (1) a task-oriented, part-based modelling of grasping and (2) BURG, our castle* of setups, tools, and metrics for building a community around an objective benchmark protocol. The idea is to boost grasping research by focusing on complete tasks. This calls for attention to object parts, since they are essential for knowing how and where the gripper can grasp given the manipulation constraints imposed by the task. Moreover, parts facilitate knowledge transfer to novel objects, across different data sources (virtual/real) and grippers, providing for a versatile and scalable system. The part-based approach naturally extends to deformable objects, for which recognizing relevant semantic parts, regardless of the object's actual deformation, is essential to make the manipulation problem tractable. Finally, by focusing on parts we can deal more easily with environmental constraints, which are detected and used to facilitate grasping (a schematic example of such part-based annotations is sketched below).

Regarding the benchmarking of manipulation, robotics has so far suffered from incomparable grasping and manipulation work. Existing datasets cover only the object detection aspect. Object sets are difficult to obtain and not extendible, and neither scenes nor manipulation tasks are replicable. There are no common tools that solve the basic needs of setting up replicable scenes or reliably estimating object pose.
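As an illustration of the part-based, task-oriented modelling described above, the following is a minimal sketch of how such annotations could be encoded in Python. All names (ObjectPart, TaskGraspAnnotation) and fields are hypothetical assumptions for illustration, not the project's actual data model.

from dataclasses import dataclass, field

# Hypothetical schema: names are illustrative, not BURG's actual data model.


@dataclass
class ObjectPart:
    """A semantic part of an object, e.g. the handle or rim of a mug."""
    name: str
    mesh_file: str  # part geometry, expressed in the object's reference frame


@dataclass
class TaskGraspAnnotation:
    """Records which parts a gripper may use for a given manipulation task."""
    task: str                                                   # e.g. 'pouring'
    allowed_parts: list[ObjectPart] = field(default_factory=list)
    forbidden_parts: list[ObjectPart] = field(default_factory=list)


# Usage: for pouring, the mug's handle is graspable but its rim must stay free.
handle = ObjectPart(name="handle", mesh_file="mug/handle.ply")
rim = ObjectPart(name="rim", mesh_file="mug/rim.ply")
pouring = TaskGraspAnnotation(task="pouring",
                              allowed_parts=[handle],
                              forbidden_parts=[rim])

Because the annotation attaches to parts rather than to whole objects, the same "handle" annotation can in principle transfer to any novel object that exposes a handle, which is what makes the part-based approach scalable.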
Hence, with the BURG benchmark, we propose to focus on community building by developing and sharing tools for reproducible performance evaluation, including the collection of data and feedback from different laboratories to study manipulation across different robot embodiments. We will develop a set of repeatable scenarios spanning different levels of quantifiable complexity, determined by the choice of objects, tasks, and environments. Examples include fully quantified settings with layers of objects, with deformable objects and environmental constraints added at higher levels. The benchmark will include metrics defined to assess the performance both of low-level primitives (object pose, grasp point and type, collision-free motion) and of manipulation tasks (stacking, aligning, assembling, packing, handover, folding) that require ordering as well as common-sense knowledge for semantic reasoning; a sketch of how a scenario and its metrics might be encoded is given below.
* Burg, f.: German for "castle".
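To make the notions of a replicable scenario and its evaluation metrics concrete, here is a minimal sketch under assumed names; SceneSpec, TrialResult, and task_success_rate are illustrative inventions, not the project's actual tooling or API.

from dataclasses import dataclass

# Hypothetical structures: they illustrate the benchmark idea, not BURG's API.

Pose = tuple[float, float, float, float, float, float, float]  # x, y, z + quaternion


@dataclass
class SceneSpec:
    """A replicable scene: known objects placed at known ground-truth poses."""
    scene_id: str
    object_poses: dict[str, Pose]  # object_id -> 6-DoF pose
    difficulty: int                # quantified complexity level of the setup


@dataclass
class TrialResult:
    """Metrics recorded for one trial, from primitives up to the full task."""
    pose_error_mm: float     # low-level: object pose estimation error
    grasp_success: bool      # low-level: grasp acquired and object held
    task_success: bool       # task level: e.g. packed, folded, handed over
    completion_time_s: float


def task_success_rate(trials: list[TrialResult]) -> float:
    """Aggregate metric: fraction of trials that completed the full task."""
    return sum(t.task_success for t in trials) / len(trials) if trials else 0.0

Aggregating such per-trial records collected in different laboratories is what would enable comparable results across robot embodiments.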
Consortium:
- Technische Universität Wien, Austria (Coordinator)
- University of Birmingham, United Kingdom
- Istituto Italiano di Tecnologia, Italy
- Institut de Robòtica i Informàtica Industrial (IRI), CSIC-UPC, Spain
Call: European CHIST-ERA 2017
Funds are provided by national research funding organisations. IRI is funded by the Spanish Ministry of Science, Innovation and Universities (MICIU).
More information is available on the official BURG webpage.
Project Publications
Journal Publications
- I. Garcia-Camacho, J. Borràs, B. Calli, A. Norton and G. Alenyà. Household cloth object set: Fostering benchmarking in deformable object manipulation. IEEE Robotics and Automation Letters, 7(3): 5866-5873, 2022.
- A. Carfi, T. Patten, Y. Kuang, A. Hammoud, M. Alameh, E. Maiettini, A.I. Weinberg, D. Faria, F. Mastrogiovanni, G. Alenyà, L. Natale, V. Perdereau, M. Vincze and A. Billard. Hand-object interaction: From human demonstrations to robot manipulation. Frontiers in Robotics and AI, 8: 714023, 2021.
- I. Garcia-Camacho, M. Lippi, M.C. Welle, H. Yin, R. Antonova, A. Varava, J. Borràs, C. Torras, A. Marino, G. Alenyà and D. Kragic. Benchmarking bimanual cloth manipulation. IEEE Robotics and Automation Letters, 5(2): 1111-1118, 2020.
Conference Publications
- I. Garcia-Camacho, J. Borràs, B. Calli, A. Norton and G. Alenyà. Cloth manipulation and perception competition. 2022 ICRA Workshop on Representing and Manipulating Deformable Objects, 2022, Philadelphia, pp. 4.
- A. Boix, S. Foix and C. Torras. Garment manipulation dataset for robot learning by demonstration through a virtual reality framework. 24th Catalan Conference on Artificial Intelligence, 2022, Sitges, in Artificial Intelligence Research and Development, Vol. 356 of Frontiers in Artificial Intelligence and Applications, pp. 199-208, IOS Press.
- I. Garcia-Camacho, J. Borràs and G. Alenyà. Benchmarking cloth manipulation using action graphs: An example in placing flat. 1st IROS Workshop on Benchmarking of Robotic Grasping and Manipulation: Protocols, Metrics and Data Analysis, 2021, Prague, Czech Republic (virtual), pp. 1-3.