Conference paper. Year: 2022

Shared knowledge in natural conversations: can entropy metrics shed light on information transfers?

Abstract

The mechanisms underlying human communication have been under investigation for decades, but the question of how understanding emerges between interlocutors remains only partially answered. Interaction theories suggest that speakers develop a structural alignment, allowing for the construction of a shared knowledge base (common ground). In this paper, we propose to apply metrics derived from information theory to quantify the amount of information exchanged between participants and the dynamics of these exchanges, providing an objective way to measure the instantiation of common ground. We focus on a corpus of free conversations augmented with prosodic segmentation and an expert annotation of thematic episodes. We show that during free conversations, the amount of information remains globally constant at the scale of the conversation, but varies with the thematic structure, underlining the role of the speaker who introduces the theme. We thus propose an original methodology that applies to uncontrolled material.
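To make the idea of an entropy metric over conversational exchanges concrete, here is a minimal Python sketch. It is not the paper's exact method: it scores each utterance by its average per-word surprisal (in bits) under a simple unigram model estimated on the whole conversation, and the function names and toy data are assumptions for illustration only.

```python
import math
from collections import Counter

def unigram_model(utterances):
    """Estimate unigram word probabilities over the whole conversation."""
    counts = Counter(word for u in utterances for word in u.split())
    total = sum(counts.values())
    return {word: c / total for word, c in counts.items()}

def utterance_entropy(utterance, probs):
    """Average per-word surprisal (bits) of one utterance under the model."""
    words = utterance.split()
    if not words:
        return 0.0
    return -sum(math.log2(probs[w]) for w in words) / len(words)

# Toy conversation (hypothetical data): track how information density
# varies from utterance to utterance.
conversation = [
    "so have you been to the new museum",
    "yes the exhibition on printing was great",
    "printing yes I read about that exhibition",
]
model = unigram_model(conversation)
for u in conversation:
    print(f"{utterance_entropy(u, model):.2f} bits/word  |  {u}")
```

Under such a model, vocabulary that both speakers reuse becomes more probable, so utterances that build on already-established material score lower in surprisal; this is the intuition behind tracking information dynamics across thematic episodes.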

Domains

Linguistics
Main file

2022.conll-1.15.pdf (10.11 MB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-04151675, version 1 (05-07-2023)

Identifiers

  • HAL Id: hal-04151675, version 1

Cite

Eliot Maës, Philippe Blache, Leonor Becerra-Bonache. Shared knowledge in natural conversations: can entropy metrics shed light on information transfers?. 26th Conference on Computational Natural Language Learning (CoNLL), Dec 2022, Abu Dhabi, United Arab Emirates. pp.213-227. ⟨hal-04151675⟩
