Journal article — Journal of Cognitive Neuroscience, 2017

Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions

Abstract

Action recognition has been found to rely not only on sensory brain areas but also, in part, on the observer's motor system. However, whether distinct auditory and visual experiences of an action modulate sensorimotor activity remains largely unknown. In the present sparse-sampling fMRI study, we determined the extent to which sensory and motor representations interact during the perception of tongue and lip speech actions. Tongue and lip speech actions were selected because an interlocutor's tongue movements are accessible through their impact on speech acoustics but are not visible, owing to the tongue's position inside the vocal tract, whereas lip movements are both "audible" and visible. Participants were presented with auditory, visual, and audiovisual speech actions, with the visual inputs showing either a sagittal view of the speaker's tongue movements or a facial view of the speaker's lip movements, previously recorded with an ultrasound imaging system and a video camera. Although the neural networks involved in visuo-lingual and visuo-facial perception largely overlapped, stronger motor and somatosensory activations were observed during visuo-lingual perception. In contrast, stronger activity was found in auditory and visual cortices during visuo-facial perception. Complementing these findings, activity in the left premotor cortex and in visual brain areas correlated with visual recognition scores for visuo-lingual and visuo-facial speech stimuli, respectively, whereas visual activity correlated with reaction times for both stimuli. These results suggest that unimodal and multimodal processing of lip and tongue speech actions rely on common sensorimotor brain areas. They also suggest that visual processing of audible but not visible movements induces motor and visual mental simulation of the perceived actions, facilitating recognition and/or learning of the association between auditory and visual signals.
Main file

JOCN_a_01057-Treille_Proof1_corrected.pdf (5.96 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01484978, version 1 (08-03-2017)

Identifiers

Cite

Avril Treille, Coriandre Emmanuel Vilain, Thomas Hueber, Laurent Lamalle, Marc Sato. Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions. Journal of Cognitive Neuroscience, 2017, 29 (3), pp.448-466. ⟨10.1162/jocn_a_01057⟩. ⟨hal-01484978⟩
467 views
225 downloads
