Conference Paper, Year: 2023

Anticipating False Beliefs and Planning Pertinent Reactions in Human-Aware Task Planning with Models of Theory of Mind

Anthony Favier
Shashank Shekhar
Rachid Alami

Abstract

It is essential for a collaborative robot to consider Theory of Mind (ToM) when interacting with humans. Indeed, performing an action in the absence of another agent may create false beliefs, as in the well-known Sally and Anne task (Wimmer and Perner 1983). The robot should be able to detect, react to, and even anticipate false beliefs of other agents that have a detrimental impact on the task to be achieved. Currently, ToM is mainly used to control task execution and to resolve detrimental false beliefs reactively. Some works introduce ToM at the planning level by considering distinct beliefs, and our work falls within this context. This work proposes an extension of an existing human-aware task planner that effectively allows the robot to anticipate a false human belief, ensuring smooth collaboration through an implicitly coordinated plan. First, we propose to capture the observability properties of the environment in the state description using two observability types and the notion of co-presence. These allow us to maintain distinct agent beliefs by reasoning directly on what agents can observe, through specifically modeled Situation Assessment processes, instead of reasoning on action effects. Then, thanks to the better estimated human beliefs, we can predict whether a false belief with an adverse impact will occur. If so, the robot's plan can, first, be to communicate minimally and proactively. Second, if this false belief is due to a non-observed robot action, the robot's plan can be to postpone this action until it can be observed by the human, avoiding the creation of the false belief. We implemented our new conceptual approach, proved its soundness and completeness, discuss its effectiveness qualitatively, and show experimental results on three novel domains.
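To make the belief-maintenance idea concrete, here is a minimal, hypothetical Python sketch (not taken from the paper or its planner) of how distinct agent beliefs could be updated from observability annotations and location-based co-presence, and how a detected divergence could trigger either a minimal communication or the postponing of the robot action. All names (`Obs`, `Fact`, `situation_assessment`, `detrimental_false_beliefs`, etc.) are illustrative assumptions, not the authors' API.

```python
# Illustrative sketch only: hypothetical data structures approximating the idea of
# maintaining distinct agent beliefs via observability types and co-presence.
from __future__ import annotations
from dataclasses import dataclass, field
from enum import Enum, auto


class Obs(Enum):
    """Hypothetical observability types for environment facts."""
    ALWAYS = auto()    # observable from anywhere
    IN_PLACE = auto()  # observable only when the agent is at the fact's location


@dataclass
class Fact:
    name: str
    value: object
    obs: Obs
    place: str | None = None  # relevant location for IN_PLACE facts


@dataclass
class Agent:
    name: str
    location: str
    beliefs: dict = field(default_factory=dict)  # fact name -> believed value


def situation_assessment(agent: Agent, world: dict[str, Fact]) -> None:
    """Update one agent's beliefs from what it can currently observe,
    instead of propagating every action effect to every agent."""
    for fact in world.values():
        observable = (
            fact.obs is Obs.ALWAYS
            or (fact.obs is Obs.IN_PLACE and agent.location == fact.place)
        )
        if observable:
            agent.beliefs[fact.name] = fact.value


def detrimental_false_beliefs(agent: Agent, world: dict[str, Fact],
                              critical: set[str]) -> set[str]:
    """Task-relevant facts the agent believes wrongly (or does not know)."""
    return {n for n in critical if agent.beliefs.get(n) != world[n].value}


# Toy run in the Sally-and-Anne spirit: the robot moves a cube while the human is away.
world = {"cube_at": Fact("cube_at", "box_A", Obs.IN_PLACE, place="table")}
human = Agent("human", location="table")
situation_assessment(human, world)   # human observes the cube in box_A

human.location = "kitchen"           # human leaves: no longer co-present
world["cube_at"].value = "box_B"     # robot moves the cube, unobserved
situation_assessment(human, world)   # no update: the fact is not observable from the kitchen

divergent = detrimental_false_beliefs(human, world, critical={"cube_at"})
if divergent:
    # A planner could insert a minimal communication action here, or instead
    # postpone the robot's move until the human is co-present again.
    print("anticipated false belief about:", divergent)
```

In this toy run the human's belief about `cube_at` stays at `box_A` after leaving, so the check reports a divergence that a planner could resolve by either of the two strategies described above; the actual planner in the paper operates on richer state and action models than this sketch.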
Main file
PlanRob-23_paper_13.pdf (597.01 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04163435, version 1 (17-07-2023)

Identifiers

  • HAL Id: hal-04163435, version 1

Cite

Anthony Favier, Shashank Shekhar, Rachid Alami. Anticipating False Beliefs and Planning Pertinent Reactions in Human-Aware Task Planning with Models of Theory of Mind. PlanRob Workshop - International Conference on Automated Planning and Scheduling (ICAPS 2023), Jul 2023, Prague, Czech Republic. ⟨hal-04163435⟩
116 views
85 downloads
