Psychophysical investigation of localization of audio-tactile stimuli in active touch
Abstract
Exploring our environment through touch often entails integrating tactile cues with auditory and/or visual inputs. The mechanisms by which touch integrates with other sensory modalities under conditions of active touch remain poorly understood. Here, our aim was to investigate auditory-tactile integration in the context of spatially localizing transient changes in friction. Using psychophysics, we measured the precision with which participants localized tactile, auditory, and audio-tactile stimuli relative to a visually displayed midline during active touch. In Experiment 1, conditions were presented in separate blocks, and participants were informed about the stimulus modality before each block. In Experiment 2, conditions were fully interleaved, and participants received no information about modality before each trial. For both experiments, we estimated within-subject differences across conditions and assessed whether bimodal stimulus presentation improves localization precision (the slope of the psychometric function). In both experiments, similar slope values were observed for the tactile-only and audio-tactile conditions, both showing higher precision than the auditory-only condition. However, we observed a reduction in bias (greater accuracy) in the bimodal condition when participants could not predict the modality. Our results suggest that participants relied more on the tactile stimuli to perform the task, so precision was not improved by concurrent auditory stimulation. While participants may have largely ignored auditory cues in the bimodal condition when information about modality was given, when modality was not predictable, integration of auditory and tactile stimuli led to a more accurate spatial haptic representation, albeit without a significant reduction in uncertainty.
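As a minimal illustration of the analysis described above (precision as the slope of the psychometric function, bias as the point of subjective equality), the sketch below fits a cumulative-Gaussian psychometric model to left/right localization responses. The stimulus positions and response proportions are hypothetical placeholders, not the authors' data, and the fitting approach is a generic assumption rather than the paper's exact method.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Cumulative-Gaussian psychometric function: probability of responding
# "right of the midline" at stimulus position x.
# mu = point of subjective equality (bias); sigma = spread (lower sigma = higher precision).
def psychometric(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

# Hypothetical data: stimulus positions (mm relative to the visual midline)
# and the proportion of "right" responses at each position.
positions = np.array([-20, -10, -5, 0, 5, 10, 20], dtype=float)
p_right = np.array([0.05, 0.20, 0.35, 0.55, 0.70, 0.85, 0.97])

# Fit mu (bias) and sigma; the slope at the PSE of a cumulative Gaussian
# is 1 / (sigma * sqrt(2 * pi)).
(mu_hat, sigma_hat), _ = curve_fit(psychometric, positions, p_right, p0=[0.0, 5.0])
slope_at_pse = 1.0 / (sigma_hat * np.sqrt(2.0 * np.pi))

print(f"bias (PSE): {mu_hat:.2f} mm, sigma: {sigma_hat:.2f} mm, slope: {slope_at_pse:.3f} /mm")
```

Condition-wise comparisons (tactile-only vs. auditory-only vs. audio-tactile) would then amount to fitting this function per condition and participant and comparing the resulting slope and bias estimates.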
Domains
Neurosciences

Origin: Files produced by the author(s)