Research Project

PYERCING: Open your ears: predicting object material from impact sounds


UPC Project

Project Description

This is a project within the Call for IRI MdM Internal Projects 2020, a call for R&D projects addressed to IRI early-career researchers under the María de Maeztu Unit of Excellence Programme.

Whether intentionally scratched or naturally manipulated during daily activities, everyday objects emit characteristic sounds that may unveil some of their physical properties, such as material, weight, stiffness, surface roughness, and even shape. These physical properties are essential not only for perception, but also for manipulation and grasping. However, most of them are typically neglected in the perception-action-learning loop. This project aims to build a dataset of impact sounds for the objects of the YCB dataset and benchmark, and to develop a deep learning model able to automatically predict an object's material from impact sounds.
The results of this project could subsequently be used to infer other physical properties which are critical for grasping but typically unavailable, such as surface friction and weight, and, more generally, to endow robots with a policy that leverages multimodal feedback from vision and audio.
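The core idea, predicting material from an impact sound, can be illustrated with a minimal toy sketch. This is not the project's model (which is a deep learning classifier trained on recorded YCB impacts); it only synthesizes damped sinusoids as crude stand-ins for impacts and labels them with a nearest-centroid rule on two spectral/temporal features. All material names and parameters below are invented for illustration.

```python
import numpy as np

SR = 16000  # sample rate (Hz); all material parameters below are invented

def synth_impact(freq, decay, dur=0.5):
    """Damped sinusoid as a crude stand-in for a recorded impact."""
    t = np.arange(int(SR * dur)) / SR
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)

def features(x):
    """Dominant frequency (Hz) and an energy-decay score."""
    spec = np.abs(np.fft.rfft(x))
    peak = np.fft.rfftfreq(x.size, 1 / SR)[np.argmax(spec)]
    env = np.abs(x)
    half = x.size // 2
    decay = np.log(env[:half].mean() / (env[half:].mean() + 1e-12))
    return np.array([peak, decay])

# Hypothetical material prototypes: (resonant frequency, damping).
MATERIALS = {"metal": (2000.0, 5.0), "wood": (600.0, 40.0), "plastic": (1200.0, 80.0)}

# "Train": average features over jittered synthetic samples per material.
rng = np.random.default_rng(0)
centroids = {
    m: np.mean([features(synth_impact(f0 * rng.uniform(0.95, 1.05),
                                      d * rng.uniform(0.9, 1.1)))
                for _ in range(10)], axis=0)
    for m, (f0, d) in MATERIALS.items()
}

def predict(x):
    """Nearest-centroid material label for one waveform."""
    f = features(x)
    return min(centroids, key=lambda m: np.linalg.norm(centroids[m] - f))

print(predict(synth_impact(1950.0, 6.0)))  # → metal
```

A real pipeline would replace the two hand-crafted features with a learned representation (e.g. a network over log-mel spectrograms of recorded impacts), but the sketch shows why the task is plausible: materials differ systematically in resonance and damping.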

Project Publications

Conference Publications

  • M. Dimiccoli, S. Patni, M. Hoffmann and F. Moreno-Noguer. Recognizing object surface material from impact sounds for robot manipulation. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Kyoto, 2022, pp. 9280-9287.
