Joint material prediction and shape attribute recognition of household objects




  • If you are interested in the proposal, please contact the supervisors.


Whether intentionally scratched or naturally manipulated during daily activities, everyday objects emit characteristic sounds that can reveal some of their physical properties, such as material, weight, stiffness, surface roughness, and even shape. These physical properties are essential not only for perception, but also for manipulation and grasping. However, most of them are typically neglected in the perception-action-learning loop.

In this project the student will use a dataset of impact sounds recently built at the IRI to develop a deep learning model able to automatically predict an object's material from impact sounds. Audio data will subsequently be used together with visual data to jointly predict material and shape attributes.
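To make the audio side of the pipeline concrete, the sketch below shows one common front end for impact-sound classification: framing the waveform, windowing, and taking a log-magnitude spectrogram, followed by a toy nearest-centroid material predictor. This is purely illustrative — the dataset, the function names, and the nearest-centroid stand-in (in place of the deep model the project will actually develop) are all assumptions, not part of the proposal.

```python
import numpy as np

def log_spectrogram(signal, n_fft=512, hop=256):
    """Frame the signal, apply a Hann window, and return log-magnitude
    spectrogram frames -- a typical front end for impact-sound models."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack(
        [signal[i * hop : i * hop + n_fft] * window for i in range(n_frames)]
    )
    mag = np.abs(np.fft.rfft(frames, axis=1))
    return np.log1p(mag)  # shape: (n_frames, n_fft // 2 + 1)

def nearest_centroid_material(features, centroids):
    """Predict a material label by comparing the time-averaged spectrum
    to per-material centroid spectra (a stand-in for a learned model)."""
    avg = features.mean(axis=0)
    dists = {m: np.linalg.norm(avg - c) for m, c in centroids.items()}
    return min(dists, key=dists.get)
```

In practice the log-spectrogram (or mel-spectrogram) frames would feed a convolutional network, and the visual branch would be fused with the audio embedding for the joint material/shape prediction stage.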

The work is under the scope of the following projects:

  • IPALM: Interactive Perception-Action-Learning for Modelling Objects (web)