Abstract - In the future, robots are expected to autonomously interact and collaborate with humans, whose presence increases uncertainty during task execution and provokes re-planning and online adaptation of robots' plans. Hence, trustworthy robots must be able to store, retrieve, and narrate important knowledge about their collaborations and adaptations. In this article, we propose a sound methodology that integrates three main elements: first, an ontology for collaborative robotics and adaptation that models the domain knowledge; second, an episodic memory for time-indexed knowledge storage and retrieval; third, a novel algorithm that extracts the relevant knowledge and generates textual explanatory narratives. The algorithm produces three types of output, based on the degree of specificity, to suit diverse uses and preferences. We conducted a pilot study to evaluate the information quality of the narratives. Finally, we discuss how the methodology can be generalized and used with other ontologies and experiences. This work boosts robot explainability, especially in cases where robots need to narrate the details of their experiences.
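To make the two core components concrete, the sketch below illustrates a time-indexed episodic memory with window-based retrieval and a narrative generator with three specificity levels. All class and field names (`Episode`, `EpisodicMemory`, `narrate`, the specificity mapping) are our own illustrative assumptions, not the data model or algorithm from the article.

```python
from dataclasses import dataclass, field
from bisect import insort

@dataclass(order=True)
class Episode:
    # Time-indexed record of one event; the fields after `timestamp`
    # are illustrative placeholders, not the article's ontology terms.
    timestamp: float
    event: str = field(compare=False)
    detail: str = field(compare=False, default="")

class EpisodicMemory:
    """Minimal time-indexed store with time-window retrieval."""
    def __init__(self):
        self._episodes: list[Episode] = []

    def store(self, episode: Episode) -> None:
        insort(self._episodes, episode)  # keep episodes sorted by timestamp

    def retrieve(self, start: float, end: float) -> list[Episode]:
        return [e for e in self._episodes if start <= e.timestamp <= end]

def narrate(episodes: list[Episode], specificity: int) -> str:
    # Three output types by degree of specificity (1 = terse summary,
    # 3 = full detail); this mapping is a guess, not the article's rule.
    if specificity == 1:
        return f"The robot performed {len(episodes)} actions."
    lines = []
    for e in episodes:
        line = f"At t={e.timestamp:.0f}s the robot {e.event}"
        if specificity >= 3 and e.detail:
            line += f" because {e.detail}"
        lines.append(line + ".")
    return " ".join(lines)
```

For example, after storing two episodes (a re-planning event and a pick-up event), `narrate(memory.retrieve(0, 10), 2)` yields one sentence per event, while level 3 also appends the stored reason for each adaptation.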
Warm-up video providing users with enough context to understand their task.
Video with explanatory narratives for Group 1 (specificity level 1).
Video with explanatory narratives for Group 2 (specificity level 2).
Video with explanatory narratives for Group 3 (specificity level 3).
As part of the evaluation of our work, we conducted a pilot study; you can download it here.