Conference paper, Year: 2012

Interpersonal stance recognition using non-verbal signals on several time windows

Abstract

We present a computational model that interprets a user's non-verbal signals during an interaction with a virtual character in order to build a representation of their interpersonal stance. The model rests on two elements: on the one hand, the analysis of multimodal signals; on the other hand, the temporal patterns of the interactants' behaviors. That is, it analyses signals and reactions to signals in their immediate context, as well as features of signal production and reaction patterns over different time windows: signal reaction, sentence reaction, conversation topic, and whole interaction. In this paper, we propose a first model parameterized with data drawn from the literature on the expression of stances through interpersonal behavior.
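
The Python sketch below only illustrates the multi-window idea described in the abstract: detected non-verbal cues are aggregated over increasingly long time windows (signal reaction, sentence reaction, conversation topic, whole interaction) and combined into a single stance estimate. The cue names, window durations, and weights are illustrative assumptions, not the parameterization proposed in the paper.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

# Hypothetical signal event: a detected non-verbal cue (e.g. a smile or a nod)
# with a timestamp and an intensity in [0, 1].
@dataclass
class SignalEvent:
    cue: str          # e.g. "smile", "gaze_at", "nod"
    timestamp: float  # seconds from the start of the interaction
    intensity: float

# Time windows named after the levels mentioned in the abstract; the
# durations are illustrative placeholders, not values from the paper.
WINDOWS = {
    "signal_reaction": 2.0,
    "sentence_reaction": 10.0,
    "conversation_topic": 60.0,
    "whole_interaction": float("inf"),
}

def window_features(events: List[SignalEvent], now: float) -> Dict[str, Dict[str, float]]:
    """Average cue intensities inside each time window ending at `now`."""
    features: Dict[str, Dict[str, float]] = {}
    for name, length in WINDOWS.items():
        in_window = [e for e in events if now - e.timestamp <= length]
        per_cue: Dict[str, List[float]] = {}
        for e in in_window:
            per_cue.setdefault(e.cue, []).append(e.intensity)
        features[name] = {cue: mean(vals) for cue, vals in per_cue.items()}
    return features

def friendliness_score(features: Dict[str, Dict[str, float]]) -> float:
    """Toy stance estimate: short-term reactions weigh less than
    long-term behavior patterns (the weights are assumptions)."""
    weights = {
        "signal_reaction": 0.1,
        "sentence_reaction": 0.2,
        "conversation_topic": 0.3,
        "whole_interaction": 0.4,
    }
    score = 0.0
    for name, w in weights.items():
        cues = features.get(name, {})
        # Smiles and mutual gaze taken as (assumed) cues of a friendly stance.
        score += w * (cues.get("smile", 0.0) + cues.get("gaze_at", 0.0)) / 2
    return score

if __name__ == "__main__":
    events = [
        SignalEvent("smile", 1.0, 0.8),
        SignalEvent("gaze_at", 5.0, 0.6),
        SignalEvent("smile", 58.0, 0.9),
    ]
    feats = window_features(events, now=60.0)
    print(friendliness_score(feats))

The design point the sketch makes is that the same stream of cues is summarized several times at different temporal granularities, so that a momentary reaction and a sustained behavioral pattern can contribute separately to the stance estimate.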
Main file: wacai2012reviewed.pdf (217.83 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01074857, version 1 (15-10-2014)

Identifiers

  • HAL Id: hal-01074857, version 1

Cite

Mathieu Chollet, Magalie Ochs, Catherine Pelachaud. Interpersonal stance recognition using non-verbal signals on several time windows. Workshop Affect, Compagnon Artificiel, Interaction, Nov 2012, Grenoble, France. ⟨hal-01074857⟩