Preprint / Working paper, Year: 2022

CosySlam: investigating object-level SLAM for detecting locomotion surfaces

Abstract

While blindfolded legged locomotion has demonstrated impressive capabilities in the last few years, further progress is expected from using exteroceptive perception to better adapt the robot behavior to the available contact surfaces. In this paper, we investigate whether monocular cameras are suitable sensors for that aim. We propose to rely on object-level SLAM, fusing RGB images and inertial measurements, to simultaneously estimate the robot balance state (orientation in the gravity field and velocity), the robot position, and the location of candidate contact surfaces. We use CosyPose, a learning-based object pose estimator for which we propose an empirical uncertainty model, as the sole front end of our visual-inertial SLAM. We then combine it with inertial measurements, which conveniently complete the system observability, although extending the proposed approach with additional measurements (e.g. kinematic information about the contacts, or a feature-based visual front end) would be straightforward. We demonstrate the interest of object-level SLAM on several locomotion sequences, using absolute metrics and through comparison with other monocular SLAM systems.
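The abstract describes an estimation architecture in which CosyPose object-pose measurements and preintegrated inertial measurements are fused into a single visual-inertial SLAM problem. The sketch below is not the authors' implementation; it only illustrates, assuming a GTSAM-style factor graph in Python, how relative object-pose factors (with a per-detection noise standing in for the proposed empirical uncertainty model) could be combined with IMU factors linking consecutive keyframes. All names, noise values, the body-frame convention, and the single-bias simplification are assumptions of this sketch.

import numpy as np
import gtsam
from gtsam.symbol_shorthand import B, L, V, X

# IMU preintegration settings (placeholder noise values, not the paper's calibration).
imu_params = gtsam.PreintegrationParams.MakeSharedU(9.81)
imu_params.setAccelerometerCovariance(np.eye(3) * 1e-3)
imu_params.setGyroscopeCovariance(np.eye(3) * 1e-4)
imu_params.setIntegrationCovariance(np.eye(3) * 1e-8)

graph = gtsam.NonlinearFactorGraph()
values = gtsam.Values()

# Priors on the first keyframe: pose, velocity and IMU bias.
prior_bias = gtsam.imuBias.ConstantBias()
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(),
          gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-2))))
graph.add(gtsam.PriorFactorVector(V(0), np.zeros(3),
          gtsam.noiseModel.Isotropic.Sigma(3, 1e-2)))
graph.add(gtsam.PriorFactorConstantBias(B(0), prior_bias,
          gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)))
values.insert(X(0), gtsam.Pose3())
values.insert(V(0), np.zeros(3))
values.insert(B(0), prior_bias)

def add_keyframe(k, imu_samples, object_detections):
    """imu_samples: list of (acc, gyro, dt) between keyframes k-1 and k.
    object_detections: list of (object_id, T_body_object, sigmas), where
    T_body_object is the CosyPose estimate expressed in the body frame and
    sigmas a 6-vector playing the role of the empirical uncertainty model."""
    # IMU factor constraining consecutive poses and velocities (single-bias simplification).
    pim = gtsam.PreintegratedImuMeasurements(imu_params, prior_bias)
    for acc, gyro, dt in imu_samples:
        pim.integrateMeasurement(acc, gyro, dt)
    graph.add(gtsam.ImuFactor(X(k - 1), V(k - 1), X(k), V(k), B(0), pim))
    # Crude initial guesses; a real system would predict them from the IMU.
    values.insert(X(k), gtsam.Pose3())
    values.insert(V(k), np.zeros(3))

    # One relative-pose factor per detected object (candidate contact surface).
    for obj_id, T_body_object, sigmas in object_detections:
        noise = gtsam.noiseModel.Diagonal.Sigmas(sigmas)
        graph.add(gtsam.BetweenFactorPose3(X(k), L(obj_id), T_body_object, noise))
        if not values.exists(L(obj_id)):
            values.insert(L(obj_id), gtsam.Pose3())

# After feeding all keyframes, solve the joint estimation problem:
# result = gtsam.LevenbergMarquardtOptimizer(graph, values).optimize()

In such a formulation, the IMU factors make the orientation in the gravity field and the velocity observable, while the L(obj_id) variables directly provide the location of candidate contact surfaces in the estimation frame.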
Main file
CosySLAM_2022.pdf (4.25 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03351438, version 1 (22-09-2021)
hal-03351438, version 2 (03-03-2022)

Identifiers

  • HAL Id: hal-03351438, version 2

Cite

César Debeunne, Médéric Fourmy, Yann Labbé, Pierre-Alexandre Léziart, Guilhem Saurel, et al.. CosySlam: investigating object-level SLAM for detecting locomotion surfaces. 2022. ⟨hal-03351438v2⟩
671 views
337 downloads
