Aligned motion-direction information for touch and vision in the human brain
Abstract
Rationale: Motion directions can be perceived through both vision and touch. Do motion directions align across the senses somewhere in the brain, and if so, in which frame of reference? This is a non-trivial computational problem because vision and touch initially encode space in different frames of reference, and our limbs constantly move through different postures.
Methods: We conducted functional magnetic resonance imaging (fMRI) experiments to study the representation of motion directions in different brain areas. In the first experiment, we used motion localizers to identify motion-selective regions in both vision and touch. In the second experiment, we delivered directional visual and tactile motion stimuli while participants adopted different hand postures.
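Why posture matters can be seen with a toy remapping: the same stroke delivered to the skin corresponds to different external-space directions depending on how the hand is oriented. The encoding below (direction strings, a palm_up flag) is purely illustrative and not part of the experimental protocol.

```python
# Toy illustration of how varying hand posture dissociates reference frames.
# Direction/posture encodings here are illustrative assumptions.
def external_direction(skin_direction: str, palm_up: bool) -> str:
    """Map a skin-centered direction to external space for a hand lying flat.

    With the palm turned up, left and right on the skin are mirrored in
    external space; toward-fingers/toward-wrist directions are unchanged.
    """
    flip = {"left": "right", "right": "left"}
    if palm_up and skin_direction in flip:
        return flip[skin_direction]
    return skin_direction

for palm_up in (False, True):
    print(f"palm_up={palm_up}: skin 'left' -> external '{external_direction('left', palm_up)}'")
# Identical skin-centered stimulation yields opposite external directions
# across postures, letting the analysis tell the two frames apart.
```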
Results: In addition to sensory-specific motion-selective regions, whole-brain univariate analysis showed that the middle occipito-temporal region (hMT+/V5) is motion selective across the senses. Multivariate pattern analysis (MVPA) additionally revealed that motion directions can be decoded in hMT+/V5 in both vision and touch. Interestingly, tactile motion directions could be decoded in both body-centered and externally centered coordinate systems. However, crossmodal decoding revealed that the representation of visual directions aligns with that of tactile directions only when the latter are coded in an external frame of reference.
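As a rough sketch of what such crossmodal MVPA can look like, the snippet below trains a linear classifier on visual-trial patterns and tests it on tactile-trial patterns labeled in either skin-centered or external coordinates. All data are synthetic, and the array shapes, two-direction design, and posture-based remapping are assumptions for illustration, not the authors' exact pipeline.

```python
# Minimal crossmodal decoding sketch with synthetic data, assuming per-trial
# voxel patterns from a region of interest such as hMT+/V5.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 100

X_visual = rng.normal(size=(n_trials, n_voxels))   # visual motion trials
y_visual = rng.integers(0, 2, size=n_trials)       # visual directions (external frame)

X_tactile = rng.normal(size=(n_trials, n_voxels))  # tactile motion trials
y_skin = rng.integers(0, 2, size=n_trials)         # directions in skin coordinates
palm_up = rng.integers(0, 2, size=n_trials).astype(bool)  # posture per trial

# Turning the palm up reverses the motion direction in external space.
y_external = np.where(palm_up, 1 - y_skin, y_skin)

# Crossmodal decoding: train on vision, test on touch under each labeling.
clf = SVC(kernel="linear").fit(X_visual, y_visual)
print(f"vision->touch, skin-centered labels: {clf.score(X_tactile, y_skin):.2f}")
print(f"vision->touch, external labels:      {clf.score(X_tactile, y_external):.2f}")
# With real data, the reported result corresponds to above-chance accuracy
# only under the external-frame labeling.
```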
Discussion: Our results show that visual and tactile motion directions are aligned in hMT+/V5 through a common external frame of reference.
Domains: Neurosciences