Conference Paper, Year: 2017

Active Multisensory Perception and LearnIng For InteractivE Robots

Abstract

The AMPLIFIER (Active Multisensory Perception and LearnIng For InteractivE Robots) project (2018-2022) will study how multisensory fusion and active perception can influence each other during the developmental sensorimotor loop of an autonomous agent. Psychophysics experiments will provide insights into how active perception may influence multisensory fusion in humans. Using neural fields, a multi-scale computational neuroscience paradigm, we aim to model these behavioral observations in order to transfer and extend the extracted functional properties to social robots. In particular, we aim to provide more natural interactions with humans by allowing the robot to better understand its environment and react to it in a more contextually appropriate way.
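As a rough illustration of the neural field paradigm mentioned in the abstract, the sketch below simulates a one-dimensional Amari-style dynamic neural field with a difference-of-Gaussians lateral kernel. The discretization, kernel parameters, and stimulus are assumptions chosen for illustration only; they are not the model used in the project.

```python
import numpy as np

# Illustrative 1D dynamic neural field (Amari-style), not the AMPLIFIER model itself:
#   tau * du/dt = -u + w * f(u) + I
# with local excitation and broader inhibition in the lateral kernel w.

N, dx, tau, dt = 100, 0.1, 1.0, 0.05            # assumed spatial/temporal discretization
x = np.arange(N) * dx

def gaussian(d, sigma):
    return np.exp(-d**2 / (2 * sigma**2))

# Lateral connectivity: short-range excitation, long-range inhibition (assumed parameters).
d = x[:, None] - x[None, :]
w = 1.0 * gaussian(d, 0.5) - 0.5 * gaussian(d, 2.0)

def f(u):
    return 1.0 / (1.0 + np.exp(-u))             # sigmoidal firing-rate function

u = np.zeros(N)                                  # field activity
I = 2.0 * gaussian(x - 5.0, 0.5)                 # localized "stimulus" input (assumed)

for _ in range(500):                             # Euler integration of the field dynamics
    u += dt / tau * (-u + dx * (w @ f(u)) + I)

print("peak activity at x =", x[np.argmax(u)])
```

Run as-is, the field settles into a localized bump of activity around the stimulated location, which is the kind of attractor dynamics neural fields are typically used to model.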
Main file
ICDL_17___Multimodality.pdf (87.27 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01839427, version 1 (14-07-2018)

Identifiers

  • HAL Id: hal-01839427, version 1

Cite

Mathieu Lefort, Jean-Charles Quinton, Marie Avillac, Adrien Techer. Active Multisensory Perception and LearnIng For InteractivE Robots. Workshop on Computational Models for Crossmodal Learning - IEEE ICDL-EPIROB, Sep 2017, Lisbon, Portugal. pp.2. ⟨hal-01839427⟩
