Single-view robot pose and joint angle estimation via render & compare
Conference paper, 2021


Abstract

We introduce RoboPose, a method to estimate the joint angles and the 6D camera-to-robot pose of a known articulated robot from a single RGB image. This is an important problem to grant mobile and itinerant autonomous systems the ability to interact with other robots using only visual information in non-instrumented environments, especially in the context of collaborative robotics. It is also challenging because robots have many degrees of freedom and an infinite space of possible configurations that often result in self-occlusions and depth ambiguities when imaged by a single camera. The contributions of this work are three-fold. First, we introduce a new render & compare approach for estimating the 6D pose and joint angles of an articulated robot that can be trained from synthetic data, generalizes to new unseen robot configurations at test time, and can be applied to a variety of robots. Second, we experimentally demonstrate the importance of the robot parametrization for the iterative pose updates and design a parametrization strategy that is independent of the robot structure. Finally, we show experimental results on existing benchmark datasets for four different robots and demonstrate that our method significantly outperforms the state of the art. Code and pre-trained models are available on the project webpage https://www.di.ens.fr/willow/research/robopose/.
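To illustrate the render & compare idea described above, here is a minimal, hypothetical sketch of an iterative refinement loop over the camera-to-robot pose and joint angles. It is not the paper's implementation: the `render` and `predict_delta` functions are stand-ins for the robot renderer and the learned refinement network, and the additive pose update is a simplification of the parametrization discussed in the paper.

```python
import numpy as np

# Hypothetical sketch of a render & compare refinement loop.
# State: a 6D camera-to-robot pose (here a 6-vector: translation + axis-angle
# rotation) and the robot joint angles.

def render(pose, joint_angles):
    # Stand-in renderer: a real system would rasterize the robot CAD model
    # in the configuration given by (pose, joint_angles).
    seed = int(abs(np.sum(pose) + np.sum(joint_angles)) * 1e4) % (2**32)
    return np.random.default_rng(seed).random((64, 64))

def predict_delta(observed, rendered, n_joints):
    # Stand-in for the learned refiner that compares the observed and rendered
    # images and predicts a pose update and a joint-angle update.
    err = float(np.mean(observed - rendered))
    return 0.1 * err * np.ones(6), 0.1 * err * np.ones(n_joints)

def render_and_compare(observed, pose, joint_angles, n_iters=10):
    """Iteratively render the current estimate, compare it with the observed
    image, and apply the predicted updates to the pose and joint angles."""
    for _ in range(n_iters):
        rendered = render(pose, joint_angles)
        d_pose, d_angles = predict_delta(observed, rendered, len(joint_angles))
        pose = pose + d_pose              # simplistic additive pose update
        joint_angles = joint_angles + d_angles
    return pose, joint_angles

if __name__ == "__main__":
    observed = np.random.default_rng(0).random((64, 64))
    pose0 = np.zeros(6)
    angles0 = np.zeros(7)  # e.g. a 7-DoF arm
    pose, angles = render_and_compare(observed, pose0, angles0)
    print(pose, angles)
```

In the actual method the update is predicted by a deep network trained on synthetic data, and the choice of how the pose and joint angles are parametrized for these iterative updates is one of the paper's key contributions.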

Dates and versions

hal-03572205, version 1 (14-02-2022)

Identifiers

Cite

Yann Labbé, Justin Carpentier, Mathieu Aubry, Josef Sivic. Single-view robot pose and joint angle estimation via render & compare. CVPR 2021 - Conference on Computer Vision and Pattern Recognition, Jun 2021, Virtual, France. ⟨hal-03572205⟩