Interpretable Prediction of Brain Activity during Natural Social Interactions using Multimodal Behavioral Signals
Abstract
We present an analytical framework for predicting the local brain activity of individuals during a conversation with another human or a humanoid robot, based on multimodal recordings of their behavior. In this framework, we first extract high-level features from the raw behavioral recordings of both interlocutors. Classifiers are then applied to predict binarized brain activity from these features using a dynamic prediction model. To validate the framework, we focus on brain regions involved in social interaction: both areas involved in speech processing during conversation and areas involved in information integration. The framework not only predicts local brain activity significantly better than chance, but also identifies which behavioral features are required for this prediction, depending on the brain area under investigation and on the nature of the conversational partner. In the left Superior Temporal Sulcus, perceived speech is the most important behavioral feature for predicting brain activity regardless of the agent, whereas in regions involved in social-signal integration, such as the Temporo-Parietal Junction, multiple features, which differ between the human and robot interlocutors, contribute to the prediction. This framework allows us to study how multiple behavioral signals from different modalities are integrated in individual brain regions during inherently complex, unconstrained natural social interactions.
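To make the pipeline concrete, the following minimal Python sketch illustrates the general structure described above: binarize a regional brain-activity time course, predict it from multimodal behavioral features with a classifier, and inspect feature importances. This is our own illustration, not the authors' code; the synthetic data, the choice of logistic regression, the median-split binarization, and all variable names are assumptions made for the example.

```python
# Illustrative sketch of the abstract's pipeline (assumptions only):
# predict binarized local brain activity from multimodal behavioral features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical inputs: one row per time window of the conversation.
# Columns stand in for high-level behavioral features extracted from both
# interlocutors (e.g., perceived speech, gaze, facial movement).
n_windows, n_features = 200, 8
X = rng.standard_normal((n_windows, n_features))

# Hypothetical activity time course for one region of interest
# (e.g., the left Superior Temporal Sulcus).
bold = rng.standard_normal(n_windows)

# Binarize brain activity: above vs. below the median signal level.
y = (bold > np.median(bold)).astype(int)

# Classifier predicting binarized activity from behavioral features;
# cross-validated accuracy is compared against the 50% chance level.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.2f} (chance = 0.50)")

# Coefficient magnitudes indicate which behavioral signals drive the
# prediction for this region, mirroring the framework's interpretability
# step (which features matter, for which area, with which partner).
clf.fit(X, y)
for i, coef in enumerate(np.abs(clf.coef_[0])):
    print(f"feature {i}: |coef| = {coef:.3f}")
```

On random data the accuracy should hover near chance; the point of the sketch is only the shape of the analysis, where per-region, per-partner comparisons of cross-validated accuracy and feature weights yield the interpretable results reported in the abstract.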