Embodied Localization in Visually-guided Walk of Humanoid Robots
Abstract
Humanoid robots are designed to resemble the body and behavior of human beings. Within this behavioral repertoire, the ability to execute visually-guided tasks is crucial for individual adaptation and relies on the on-board sensory system. However, research on walking and localization is far from conclusive. Given the difficulty of processing visual feedback, some studies have approached the problem by placing external sensors in the environment, thus neglecting the corporeal metaphor. Others, despite exploring on-board solutions, have relied on an extensive model of the environment, thereby treating the system as an information-processing unit abstracted from a body. This work presents a methodology for achieving embodied localization in the service of visually-guided walk. The solution leans on robust segmentation from monocular vision, ego-cylindrical localization, and minimal knowledge about the stimuli in the environment.
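To make the notion of ego-cylindrical localization concrete, the minimal Python sketch below shows how a landmark centroid detected by monocular segmentation could be expressed as an azimuth/elevation pair around the robot's own vertical axis. This is only an illustrative assumption, not the pipeline described in the thesis: the pinhole intrinsics (fx, fy, cx, cy), the head_yaw_rad parameter, and the function name are all hypothetical.

```python
# Minimal sketch (not the thesis implementation): mapping a segmented landmark's
# pixel centroid from a monocular camera into an ego-cylindrical representation
# (azimuth around the robot's vertical axis, plus image elevation), assuming a
# pinhole camera model and a known head yaw. All names and values are illustrative.
import math

def pixel_to_ego_cylindrical(u, v, fx, fy, cx, cy, head_yaw_rad):
    """Convert a pixel (u, v) to ego-centric azimuth/elevation angles (radians)."""
    # Bearing of the pixel relative to the optical axis (pinhole model).
    azimuth_cam = math.atan2(u - cx, fx)
    elevation_cam = math.atan2(cy - v, fy)
    # Ego-cylindrical azimuth: camera bearing shifted by the head's yaw so that
    # the angle is expressed around the robot's own vertical axis, wrapped to (-pi, pi].
    azimuth_ego = (head_yaw_rad + azimuth_cam + math.pi) % (2 * math.pi) - math.pi
    return azimuth_ego, elevation_cam

if __name__ == "__main__":
    # Hypothetical intrinsics and a landmark centroid returned by segmentation.
    az, el = pixel_to_ego_cylindrical(u=420, v=180, fx=525.0, fy=525.0,
                                      cx=320.0, cy=240.0, head_yaw_rad=0.35)
    print(f"landmark azimuth={math.degrees(az):.1f} deg, "
          f"elevation={math.degrees(el):.1f} deg")
```

In such a representation, only the angular position of a known stimulus relative to the body is retained, which is consistent with the abstract's emphasis on minimal environmental knowledge rather than a full metric map.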