Decoding sounds depicting hand-object interactions in primary somatosensory cortex
Abstract
Neurons, even in the earliest sensory regions of cortex, are subject to a great deal of contextual influence from connections both within and across modalities. Recent work has shown that primary sensory areas can respond to, and in some cases discriminate, stimuli that are not of their target modality: for example, primary somatosensory cortex (SI) discriminates visual images of graspable objects. In the present work, we investigated whether SI would discriminate sounds depicting hand-object interactions (e.g. bouncing a ball). In a rapid event-related functional magnetic resonance imaging (fMRI) experiment, participants listened attentively to sounds from three categories: hand-object interactions and two control categories, pure tones and animal vocalizations, while performing a one-back repetition detection task. Multi-voxel pattern analysis revealed significant decoding of different hand-object interactions within SI, but not for either control category. Crucially, in hand-sensitive voxels defined from an independent tactile localizer, decoding accuracies were significantly higher for hand-object interactions than for pure tones in left SI. Our findings indicate that simply hearing sounds depicting familiar hand-object interactions elicits distinct patterns of activity in SI, despite the complete absence of tactile stimulation. These results highlight the rich information that can be transmitted across sensory modalities, even to primary sensory areas.
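The abstract does not detail the decoding pipeline, but the multi-voxel pattern analysis it describes is typically implemented as cross-validated classification of voxel response patterns within a region of interest. The sketch below illustrates one common approach, assuming trial-wise response estimates have already been extracted from an SI ROI; the array shapes, number of runs, classifier, and leave-one-run-out scheme are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal MVPA decoding sketch (illustrative; not the authors' pipeline).
# Assumes X is a (n_trials, n_voxels) matrix of trial-wise responses from an
# SI region of interest, y holds the sound-exemplar labels, and `runs` marks
# which scanning run each trial came from.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(0)

# Simulated stand-in data: 8 runs x 12 trials, 500 voxels,
# 3 hand-object interaction exemplars as class labels.
n_runs, trials_per_run, n_voxels = 8, 12, 500
X = rng.standard_normal((n_runs * trials_per_run, n_voxels))
y = np.tile(np.repeat(np.arange(3), trials_per_run // 3), n_runs)
runs = np.repeat(np.arange(n_runs), trials_per_run)

# Leave-one-run-out cross-validation with a linear classifier,
# a common choice for decoding fMRI response patterns.
clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"Mean decoding accuracy: {scores.mean():.3f} (chance = {1/3:.3f})")
```

With random data as here, accuracy hovers around chance (1/3); above-chance accuracy on real trial estimates is the evidence of decodable category information reported in the abstract.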
Domains
Neurosciences

Origin
Files produced by the author(s)