GPS-Enhanced RGB-D-IMU Calibration for Accurate Pose Estimation
Abstract
Accurate calibration of all sensor modalities is crucial for multi-modal sensor fusion in robotic 3D pose estimation (odometry). To achieve optimal calibration, either deterministic filters or non-deterministic optimization models can be used to estimate the time-invariant intrinsic and extrinsic parameters. We introduce a novel optimization-based approach for the intrinsic and extrinsic calibration of an RGB-D-IMU visual-inertial setup, built around a GPS-aided optimizer bootstrapping algorithm. Our front-end pipeline relies on an optical flow Visual Odometry (VO) method to obtain reliable initial estimates of the RGB camera intrinsics and trajectory. In addition to calibrating all time-invariant parameters, our back-end optimizes spatio-temporal quantities such as the target's pose, the 3D point cloud, and the IMU biases. The proposed calibration algorithm for the complete RGB-D-IMU setup is validated on both real-world and high-quality simulated sequences. Ablation studies on ground and aerial vehicles assess the contribution of each sensor in the multi-modal (RGB-D-IMU-GPS) setup to the accuracy of the vehicle's pose estimation. The implementation of the proposed algorithm is available in our GitHub repository: https://github.com/AbanobSoliman/HCALIB
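To make the front-end idea concrete, the sketch below illustrates one optical-flow VO step of the kind the abstract refers to: features are tracked between consecutive RGB frames with pyramidal Lucas-Kanade flow and the relative camera motion is recovered from the essential matrix, providing an initial trajectory estimate. This is a minimal illustration using standard OpenCV calls, not the authors' implementation from the linked repository; the function name `vo_step` and its inputs are hypothetical.

```python
# Minimal sketch (not the authors' implementation): one optical-flow VO step that
# tracks features between two grayscale RGB frames and recovers relative motion.
import cv2
import numpy as np

def vo_step(prev_gray, curr_gray, K):
    # Detect corners in the previous frame.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                 qualityLevel=0.01, minDistance=7)
    # Track them into the current frame with pyramidal Lucas-Kanade optical flow.
    p1, st, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    good0, good1 = p0[st.ravel() == 1], p1[st.ravel() == 1]
    # Estimate the essential matrix with RANSAC and recover relative rotation/translation.
    E, mask = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good1, good0, K, mask=mask)
    # Translation is up to scale; the pair (R, t) serves only as an initial estimate
    # that a back-end optimizer would refine together with the calibration parameters.
    return R, t
```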