Aligned motion-direction information for touch and vision in the human brain
Abstract
Motion directions can be perceived through vision and touch. Do motion directions align across the senses somewhere in the brain, and if so, in which frame of reference? This is a non-trivial computational problem because vision and touch initially code space in different frames of reference, and because our limbs constantly move and adopt different postures. In a first experiment, we used fMRI to identify motion-selective regions in vision and touch. In addition to sensory-specific motion-selective regions, we observed that the middle occipito-temporal region (hMT+/V5) is motion selective across the senses. In a second experiment, we delivered directional visual and tactile motion stimuli across different hand postures. Multivariate Pattern Analysis (MVPA) revealed that motion directions can be decoded in both vision and touch. Interestingly, tactile motion directions could be decoded in both body-centered and externally-centered coordinate systems. However, crossmodal decoding revealed that visual motion directions align with tactile directions only in an externally-centered coordinate system. Our results show that motion directions in vision and touch are aligned in hMT+/V5 through a common external frame of reference.
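The crossmodal decoding logic can be sketched as follows. This is a minimal, hypothetical illustration using simulated data and a scikit-learn linear classifier; the variable names, data shapes, and classifier choice are assumptions for illustration and do not reflect the authors' actual analysis pipeline. The key idea is to train a direction classifier on activity patterns from one modality (touch) and test it on patterns from the other (vision): above-chance transfer implies a shared direction code.

```python
# Hypothetical sketch of crossmodal MVPA decoding (simulated data).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated voxel patterns: trials x voxels, labeled by motion direction.
n_trials, n_voxels = 80, 200
X_touch = rng.normal(size=(n_trials, n_voxels))   # tactile trials
X_vision = rng.normal(size=(n_trials, n_voxels))  # visual trials
y_touch = rng.integers(0, 4, size=n_trials)       # 4 motion directions
y_vision = rng.integers(0, 4, size=n_trials)

# Within-modality decoding: cross-validated accuracy inside one sense.
clf = LinearSVC()
acc_touch = cross_val_score(clf, X_touch, y_touch, cv=5).mean()

# Crossmodal decoding: fit on touch, test on vision. With real data,
# above-chance transfer accuracy would indicate a shared (e.g.,
# externally centered) representation of motion direction.
clf.fit(X_touch, y_touch)
acc_cross = clf.score(X_vision, y_vision)
print(f"within-touch: {acc_touch:.2f}, touch->vision: {acc_cross:.2f}")
```

Testing the frame of reference would then amount to relabeling tactile trials (body-centered vs. externally centered direction labels, which dissociate when hand posture changes) and asking which labeling supports crossmodal transfer.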
Domains

Neurosciences

Origin: Files produced by the author(s)