Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications
Conference poster, Year: 2018


Abstract

To make sense of their environment, both humans and robots need to construct a consistent perception from many sources of information (including visual and auditory stimulation). Multimodal merging thus plays a key role in human perception, for instance by lowering reaction times and detection thresholds. Psychophysics experiments have shown that humans are able to fuse information in a Bayes-optimal way (Ernst & Banks, 2002), weighting each modality by its precision (i.e., the inverse of its perceived variance). Weights are usually estimated a posteriori from experimental data, but the mechanisms by which agents may estimate such precision online remain poorly studied. Candidate mechanisms may stem from sensorimotor accounts of perception and the predictive coding framework, with actions (e.g., saccades) being used to simultaneously estimate stimulus location and sensory precision (Friston et al., 2011). In the context of the AMPLIFIER (Active Multisensory Perception and LearnIng For InteractivE Robots) project (2018-2022), we study the mutual influence of multisensory fusion and active perception. The project combines three complementary components. First, psychophysics experiments help confirm and refine hypotheses by manipulating stimuli and task constraints (e.g., audio-visual discrepancy, stimulus presentation time, number of fixations or saccades during presentation) and estimating their effects on saccadic eye movements, as well as the effect of eye movements on target localization. Second, neurocomputational models based on the dynamic neural field framework provide distributed representations of stimuli, making it possible to replicate experimental data and to make predictions. Finally, these models will be coupled with active decision-making and developmental learning of sensorimotor contingencies, and embedded in social robotic platforms to improve human-robot interaction through more natural gaze behavior and more appropriate reactions in complex environments.
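As a concrete illustration of the precision-weighted fusion cited above (Ernst & Banks, 2002), here is a minimal Python sketch of Bayes-optimal combination of two location cues. The function name and the example numbers are illustrative, not taken from the poster.

```python
def fuse_estimates(mu_v, var_v, mu_a, var_a):
    """Fuse a visual and an auditory position estimate, Bayes-optimally.

    Each cue is weighted by its precision (inverse variance); the fused
    variance is never larger than that of either cue alone.
    """
    precision_v = 1.0 / var_v
    precision_a = 1.0 / var_a
    w_v = precision_v / (precision_v + precision_a)  # weight of the visual cue
    mu = w_v * mu_v + (1.0 - w_v) * mu_a             # fused location estimate
    var = 1.0 / (precision_v + precision_a)          # fused variance
    return mu, var

# A precise visual cue at 10 deg and a noisier auditory cue at 14 deg:
mu, var = fuse_estimates(mu_v=10.0, var_v=1.0, mu_a=14.0, var_a=4.0)
print(mu, var)  # -> 10.8 0.8 (the estimate leans toward the more precise cue)
```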
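The neurocomputational models mentioned in the abstract build on the dynamic neural field framework. The sketch below shows one Euler step of a generic 1D Amari-type field receiving two Gaussian input bumps standing in for discrepant visual and auditory cues; `make_kernel`, `dnf_step`, and all parameter values are hypothetical choices meant only to convey the lateral-interaction dynamics, not the project's actual model.

```python
import numpy as np

def make_kernel(n, a_exc=1.5, sigma_exc=3.0, a_inh=0.8, sigma_inh=9.0):
    """Difference-of-Gaussians lateral kernel on a ring: local excitation,
    broader inhibition (the classic Amari interaction profile)."""
    x = np.arange(n)
    d = np.minimum(np.abs(x[:, None] - x[None, :]),
                   n - np.abs(x[:, None] - x[None, :]))
    return (a_exc * np.exp(-d**2 / (2 * sigma_exc**2))
            - a_inh * np.exp(-d**2 / (2 * sigma_inh**2)))

def dnf_step(u, inputs, kernel, dt=0.05, tau=1.0, h=-0.5):
    """One Euler step of tau * du/dt = -u + h + inputs + kernel * f(u)."""
    f = (u > 0).astype(float)       # Heaviside firing-rate nonlinearity
    lateral = kernel @ f / u.size   # lateral excitation and inhibition
    return u + dt / tau * (-u + h + inputs + lateral)

# Two Gaussian input bumps at nearby positions, with different strengths.
n = 100
x = np.arange(n)
inputs = 2.0 * np.exp(-(x - 45)**2 / 18) + 1.0 * np.exp(-(x - 55)**2 / 50)
u = np.full(n, -0.5)
kernel = make_kernel(n)
for _ in range(200):
    u = dnf_step(u, inputs, kernel)
print(np.argmax(u))  # the field settles into a peak near the stronger input
```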
Main file
poster_WEM_2018_AMPLIFIER_v2_low.pdf (1.28 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01894621 , version 2 (14-07-2018)
hal-01894621 , version 1 (12-10-2018)

Identifiers

  • HAL Id: hal-01894621, version 2

Cite

Mathieu Lefort, Jean-Charles Quinton, Simon Forest, Adrien Techer, Alan Chauvin, et al. Influence of eye-movements on multisensory stimulus localization: experiments, models and robotics applications. Grenoble Workshop on Models and Analysis of Eye Movements, Jun 2018, Grenoble, France. pp. 1. ⟨hal-01894621v2⟩
659 views
132 downloads
