PhD Thesis

User-centric Explainable Human-Robot Interaction

  • Started: 01/10/2023


Many current works focus on algorithms that generate explanations and only afterwards evaluate their impact on users' trust in and understanding of robots. We instead suggest a user-centric approach to explainability from the very beginning of the design process. We argue that, to encourage humans to trust and willingly use robots, robots should be usable: they should perform well and provide intuitive interactions while supporting understanding. Concretely, we propose a participatory design methodology with three main steps: (1) identify, together with users, what makes an application usable; (2) co-design the interface to ensure it is intuitive and understandable; and only then (3) redefine and develop the robot's functionality and autonomous behaviours. This framework should support the design of explainable robots that can adapt to different user types and personalize to specific individual needs. We plan to evaluate the proposed participatory design framework in several domains with different stakeholders.