Journal article in Adaptive Behavior, 2020

From motor to visually guided bimanual affordance learning

Abstract

The mechanisms by which the brain orchestrates multi-limb joint action have yet to be elucidated, and few computational sensorimotor (SM) learning approaches have dealt with the problem of acquiring bimanual affordances. We propose a series of bidirectional (forward/inverse) SM maps, together with their associated learning processes, that generalize naturally from uni- to bimanual interaction (and affordances), reinforcing the motor equivalence property. The maps range from fully sensorimotor to purely sensory: full body control; delta SM control (through small action changes); and delta sensory co-variation (how body-related perceptual cues covary with object-related ones). We make several contributions to how these SM maps are learned: (1) Context and Behavior-Based Babbling, which generalizes goal babbling to the interleaving of absolute and local goals, including the guidance of reflexive behaviors; (2) Event-Based Learning, in which learning steps are driven by visual and haptic events; and (3) Affordance Gradients, the vector field gradients along which an object can be manipulated. Our modeling of bimanual affordances is in line with current robotics research on forward visuomotor mappings and visual servoing, enforces the motor equivalence property, and is consistent with neurophysiological findings such as the multiplicative encoding scheme.
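To make the goal-babbling idea behind contribution (1) concrete, the sketch below trains a toy forward/inverse sensorimotor map by interleaving randomly sampled sensory goals with small local motor perturbations. This is a minimal hypothetical illustration, not the authors' implementation: the 2-DoF planar arm, the nearest-neighbour inverse query, and all identifiers are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def environment(m):
    """Toy 2-DoF planar arm with unit link lengths: joint angles m -> hand position (x, y)."""
    return np.array([np.cos(m[0]) + np.cos(m[0] + m[1]),
                     np.sin(m[0]) + np.sin(m[0] + m[1])])

# Memory of (motor, sensory) experiences: the forward map is the dataset
# itself; the inverse map is a nearest-neighbour query on the sensory side.
motors, sensors = [], []

# Bootstrap the memory with a few random motor babbles.
for _ in range(10):
    m = rng.uniform(-np.pi, np.pi, 2)
    motors.append(m)
    sensors.append(environment(m))

# Goal babbling: sample a sensory goal, invert it via nearest-neighbour
# lookup, perturb the retrieved motor command locally, execute, and store.
for _ in range(500):
    goal = rng.uniform(-2.0, 2.0, 2)              # absolute sensory goal
    S = np.array(sensors)
    nearest = int(np.argmin(np.linalg.norm(S - goal, axis=1)))
    m = motors[nearest] + rng.normal(0.0, 0.1, 2)  # local motor variation
    motors.append(m)
    sensors.append(environment(m))

# Inverse query: best known motor command for a new sensory goal.
goal = np.array([1.0, 1.0])
S = np.array(sensors)
best = int(np.argmin(np.linalg.norm(S - goal, axis=1)))
print("goal:", goal, "reached:", environment(motors[best]))
```

Sampling goals in sensory space rather than motor space lets exploration concentrate on outcomes the agent cares about, while the local motor perturbations supply the "delta" experiences that the paper's delta SM control map is built from.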
File not deposited

Dates and versions

hal-03445197, version 1 (23-11-2021)

Cite

Martí Sánchez-Fibla, Sébastien Forestier, Clément Moulin-Frier, Jordi-Ysard Puigbò, Paul F. M. J. Verschure. From motor to visually guided bimanual affordance learning. Adaptive Behavior, 2020, 28 (2), pp. 63-78. ⟨10.1177/1059712319855836⟩. ⟨hal-03445197⟩