Publication
Dynamics of mental models: Objective vs. subjective user understanding of a robot in the wild
Journal Article (2025)
Journal
IEEE Robotics and Automation Letters
Pages
7755-7762
Volume
10
Number
8
Doc link
https://doi.org/10.1109/LRA.2025.3579217
Authors
- Gebelli Guinjoan, Ferran
- Garrell Zulueta, Anaís
- Lemaignan, Séverin
- Ros, Raquel
Abstract
In Human-Robot Interaction research, assessing how humans understand the robots they interact with is crucial, particularly when studying the impact of explainability and transparency. Some studies evaluate objective understanding by analysing the accuracy of users' mental models, while others rely on perceived, self-reported levels of subjective understanding. We hypothesise that these two dimensions of understanding may diverge and are therefore complementary methods for assessing the effects of explainability on users. In our study, we track the weekly progression of users' understanding of an autonomous robot operating in a healthcare centre over five weeks. Our results reveal a notable mismatch between objective and subjective understanding. In areas where participants lacked sufficient information, their perception of understanding, i.e. subjective understanding, rose with increased contact with the system, while their actual, objective understanding did not. We attribute these results to inaccurate mental models that persist due to limited feedback from the system. Future research should clarify how both objective and subjective dimensions of understanding can be influenced by explainability measures, and how these two dimensions of understanding affect other desiderata such as trust or usability.
Categories
Service robots
Author keywords
Social HRI, long-term interaction, human-centered robotics
Scientific reference
F. Gebelli, A. Garrell Zulueta, S. Lemaignan and R. Ros. Dynamics of mental models: Objective vs. subjective user understanding of a robot in the wild. IEEE Robotics and Automation Letters, 10(8): 7755-7762, 2025.