Vision-guided motion primitives for humanoid reactive walking: decoupled vs. coupled approaches
Abstract
This paper proposes a novel visual servoing approach to control the dynamic walk of a humanoid robot. Online visual information is provided by an on-board camera and used to drive the robot towards a specific goal. Our work builds upon a recent reactive pattern generator that uses Model Predictive Control (MPC) to modify footsteps, center of mass and center of pressure trajectories so as to track a reference velocity. The contribution of this paper is to formulate the MPC problem with visual feedback taken into account. We compare our approach with a scheme that decouples visual servoing from walking gait generation: a reference velocity is first computed by visual servoing and then fed as input to the pattern generator. Our MPC-based approach avoids a number of limitations that appear in decoupled methods; in particular, visual constraints can be introduced directly inside the locomotion controller, and camera motions do not have to be accounted for separately. Both approaches are compared numerically and validated in simulation, where our MPC method shows faster convergence.
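For reference, a minimal sketch of the decoupled baseline's first stage is the classical image-based visual servoing law; the features, interaction matrix estimate and gain below are generic assumptions, not necessarily the exact quantities used in the paper:

$$
\mathbf{v}_c = -\lambda\, \widehat{\mathbf{L}}_{\mathbf{e}}^{+}\, \mathbf{e}, \qquad \mathbf{e} = \mathbf{s} - \mathbf{s}^{*},
$$

where $\mathbf{s}$ denotes the current visual features measured by the on-board camera, $\mathbf{s}^{*}$ their desired value at the goal, $\widehat{\mathbf{L}}_{\mathbf{e}}^{+}$ the pseudo-inverse of an estimate of the interaction matrix, and $\lambda > 0$ a scalar gain. In the decoupled scheme, the resulting camera velocity $\mathbf{v}_c$ is passed as the reference velocity to the walking pattern generator; in the coupled approach proposed here, the visual feedback is instead folded directly into the MPC formulation.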
Domains
Robotics [cs.RO]

Origin: Files produced by the author(s)