Extrinsic calibration between a multi-layer lidar and a camera
Abstract
In this paper, we present a novel approach for solving the extrinsic calibration between a camera and a multi-layer laser range finder. Our approach is designed for intelligent vehicle applications, where the separation between the sensor frames is frequently large. For this purpose, we use a circle-based calibration object because its geometry allows us to obtain not only an accurate pose estimation by taking advantage of the 3D multi-layer laser range finder perception, but also a simultaneous estimation of the pose in the camera frame and of the camera intrinsic parameters. These advantages simplify the calibration task in outdoor environments. The method determines the relative position of the sensors by estimating sets of corresponding features and by solving the classical absolute orientation problem. The proposed method is evaluated using different synthetic environments and real data. An error propagation analysis is performed in order to estimate the calibration accuracy and the confidence intervals. Finally, we present a projection of the laser data onto the images to validate the consistency of the results.
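As a point of reference for the last step mentioned above, the classical absolute orientation problem admits a well-known closed-form SVD-based solution (Horn/Kabsch style). The following is a minimal sketch of that generic solution, assuming corresponding 3D points (e.g., estimated circle centers of the calibration target) are available in both the lidar and camera frames; the function and variable names are illustrative and not taken from the authors' implementation.

```python
import numpy as np

def absolute_orientation(P_lidar, P_cam):
    """Closed-form solution to the absolute orientation problem.

    P_lidar, P_cam: (N, 3) arrays of corresponding 3D points expressed in
    the lidar frame and the camera frame. Returns (R, t) such that
    P_cam ≈ R @ P_lidar + t.
    """
    mu_l = P_lidar.mean(axis=0)   # centroid of the lidar points
    mu_c = P_cam.mean(axis=0)     # centroid of the camera points

    # Cross-covariance of the centered point sets
    H = (P_lidar - mu_l).T @ (P_cam - mu_c)

    # SVD of the cross-covariance yields the optimal rotation
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T

    # Translation aligning the two centroids
    t = mu_c - R @ mu_l
    return R, t
```

With noise-free correspondences, applying the recovered (R, t) to the lidar points reproduces the camera-frame points exactly; with noisy measurements it gives the least-squares rigid transform, which is the setting the error propagation analysis in the paper addresses.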