Self-calibrating smooth pursuit through active efficient coding - Archive ouverte HAL
Journal Articles | Robotics and Autonomous Systems | Year: 2014

Self-calibrating smooth pursuit through active efficient coding

Abstract

This paper presents a model for the autonomous learning of smooth pursuit eye movements based on an efficient coding criterion for active perception. The model accounts for the joint development of visual encoding and eye control. Sparse coding models encode the incoming data at two different spatial resolutions and capture the statistics of the input in spatio-temporal basis functions. A reinforcement learner controls eye velocity so as to maximize a reward signal based on the efficiency of the encoding. We consider the embodiment of the approach in the iCub simulator and on the real robot. Motion perception and smooth pursuit control are not explicitly expressed as tasks for the robot to achieve but emerge as the result of the system's active attempt to efficiently encode its sensory inputs. Experiments demonstrate that the proposed approach is self-calibrating and robust to strong perturbations of the perception-action link.
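The abstract describes an architecture in which a sparse coder supplies the reward signal for a reinforcement learner that controls eye velocity. The Python sketch below is a minimal, illustrative toy built under stated assumptions (invented names, dimensions, and a simplified one-dimensional tracking environment); it is not the authors' model or code. It only shows how negative reconstruction error from a matching-pursuit sparse coder can act as the reward that drives an epsilon-greedy learner toward the eye velocity that cancels retinal slip.

import numpy as np

rng = np.random.default_rng(0)

N_BASES, PATCH_DIM = 32, 50           # number of basis functions, patch size (assumed)
ACTIONS = np.linspace(-2.0, 2.0, 9)   # candidate eye velocities in deg/s (assumed)

# Random dictionary standing in for the learned spatio-temporal bases.
D = rng.standard_normal((PATCH_DIM, N_BASES))
D /= np.linalg.norm(D, axis=0)

def sparse_code(x, k=5):
    """Greedy matching pursuit: approximate x with k basis functions."""
    residual, coeffs = x.copy(), np.zeros(N_BASES)
    for _ in range(k):
        proj = D.T @ residual
        j = np.argmax(np.abs(proj))
        coeffs[j] += proj[j]
        residual -= proj[j] * D[:, j]
    return coeffs, residual

def coding_reward(x):
    """Reward = negative reconstruction error, a proxy for coding efficiency."""
    _, residual = sparse_code(x)
    return -np.sum(residual ** 2)

def observe(target_velocity, eye_velocity):
    """Toy sensory patch: retinal slip (target minus eye velocity) plus sensor noise."""
    slip = target_velocity - eye_velocity
    return slip * 0.1 * rng.standard_normal(PATCH_DIM) + 0.01 * rng.standard_normal(PATCH_DIM)

# Epsilon-greedy action values over the candidate eye velocities.
Q, counts = np.zeros(len(ACTIONS)), np.zeros(len(ACTIONS))
target_velocity = 1.0  # deg/s, assumed constant target motion

for step in range(2000):
    a = rng.integers(len(ACTIONS)) if rng.random() < 0.1 else np.argmax(Q)
    r = coding_reward(observe(target_velocity, ACTIONS[a]))
    counts[a] += 1
    Q[a] += (r - Q[a]) / counts[a]   # incremental mean of the reward per action

print("learned eye velocity:", ACTIONS[np.argmax(Q)])  # approaches the target velocity

In this toy, coding is most efficient when the retinal image is stabilized (zero slip), so maximizing the coding reward makes the selected eye velocity converge toward the target velocity, which mirrors the emergence of pursuit behavior described in the abstract.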
Main file: ctRAS-manuscript.pdf (2.4 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01113340, version 1 (05-02-2015)

Identifiers

Cite

Céline Teulière, S. Forestier, L. Lonini, Cong Zhang, Y. Zhao, et al. Self-calibrating smooth pursuit through active efficient coding. Robotics and Autonomous Systems, 2014, http://dx.doi.org/10.1016/j.robot.2014.11.006. ⟨10.1016/j.robot.2014.11.006⟩. ⟨hal-01113340⟩