A multi-sensor calibration toolbox for Kinect: Application to Kinect and laser range finder fusion.
Abstract
In most indoor mobile robot navigation, obstacle avoidance is a crucial task and must be reliable. Fusion of multiple data sources can be used to detect obstacles of arbitrary shape. A laser range finder is usually used for this task in the presence of simple obstacles. 3D sensors such as the Kinect provide 3D information that can be used to detect more complex obstacles. However, unlike a laser range finder, the Kinect has strong limitations such as its measuring range and field of view. This paper proposes a full calibration procedure for the different sensors that can be coupled with Microsoft's Kinect sensor. The approach can also be applied to a large variety of 3D depth sensors such as time-of-flight cameras, 3D LIDAR, or RADAR. The basic idea is to compute the Euclidean transformation between each pair of sensors. In this paper, we show that the chessboard methods used to calibrate color cameras can be extended to 3D depth sensors like the Kinect. A real experiment demonstrates the benefit of the fusion, based on the calibration results, for detecting complex obstacles reliably.
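As a rough illustration of the chessboard-based extrinsic calibration idea summarized above (not the authors' implementation), the following Python/OpenCV sketch estimates the chessboard pose in each sensor's frame and then derives the rigid Euclidean transform between the two frames. The pattern size, square size, and the intrinsics `K`/`dist` passed to `board_pose` are assumptions made for the example only.

```python
import numpy as np
import cv2

# Hypothetical chessboard geometry (not from the paper): 9x6 inner corners, 25 mm squares.
PATTERN = (9, 6)
SQUARE = 0.025  # metres

# 3D coordinates of the chessboard corners expressed in the board frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE


def board_pose(image, K, dist):
    """Estimate the chessboard pose (R, t) in one camera's frame."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    ok, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec


def relative_transform(pose_a, pose_b):
    """Rigid transform mapping points from sensor A's frame to sensor B's frame.

    If X_a = R_a X_board + t_a and X_b = R_b X_board + t_b, then
    X_b = R_ba X_a + t_ba with R_ba = R_b R_a^T and t_ba = t_b - R_ba t_a.
    """
    R_a, t_a = pose_a
    R_b, t_b = pose_b
    R_ba = R_b @ R_a.T
    t_ba = t_b - R_ba @ t_a
    T = np.eye(4)
    T[:3, :3] = R_ba
    T[:3, 3:] = t_ba
    return T
```

In practice, one would average (or jointly optimize over) the transforms obtained from several chessboard views, and a depth sensor or laser range finder would contribute its own estimate of the board plane rather than 2D corner detections; the sketch only shows the camera-to-camera case.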