Motor and visual influences on auditory neural processing during speaking and listening
Abstract
During speaking or listening, endogenous motor or exogenous visual processes have been shown to fine-tune the auditory neural processing of the incoming acoustic speech signal. To compare the impact of these cross-modal effects on auditory evoked responses, two sets of speech production and perception tasks were contrasted using EEG. In the first set, participants produced vowels in a self-paced manner while listening to their auditory feedback. Following the production task, they passively listened to the entire recorded speech sequence. In the second set, the procedure was identical except that participants also watched their own articulatory movements online. While both endogenous motor and exogenous visual processes fine-tuned auditory neural processing, these cross-modal effects acted differentially on the amplitude and latency of auditory evoked responses. A reduced amplitude was observed for auditory evoked responses during speaking compared to listening, irrespective of whether the feedback was auditory or audiovisual. Adding orofacial visual movements to the acoustic speech signal also shortened the latency of auditory evoked responses, irrespective of whether the task involved perception or production. Taken together, these results suggest distinct motor and visual influences on auditory neural processing, possibly through different neural gating and predictive mechanisms.