Robust Data Fusion of Multi-modal Sensory Information for Mobile Robots
Abstract
Urban Search and Rescue (USAR) missions for mobile robots require state estimation systems that remain reliable in dynamically changing environments. We design and evaluate a data fusion system for the localization of a mobile skid-steer robot intended for USAR missions. We exploit a rich sensor suite comprising both proprioceptive (inertial measurement unit and track odometry) and exteroceptive sensors (omnidirectional camera and rotating laser rangefinder). To cope with the specificities of each sensing modality (such as significantly differing sampling frequencies), we introduce a novel fusion scheme based on an Extended Kalman filter for 6DOF orientation and position estimation. We demonstrate the performance in field tests covering more than 4.4 km driven under standard USAR conditions. Part of our datasets includes ground-truth positioning: indoors with a Vicon motion capture system and outdoors with a Leica theodolite tracker. The overall median localization accuracy achieved by combining all four modalities was 1.2% and 1.4% of the total distance traveled for indoor and outdoor environments, respectively. To identify the true limits of the proposed data fusion, we propose and employ a novel experimental evaluation procedure based on failure-case scenarios. In this way, we address common issues such as slippage, reduced camera field of view, and limited laser rangefinder range, together with moving obstacles corrupting the metric map. We believe such a characterization of the failure cases is a first step towards identifying the behavior of state estimation under these conditions. We release all our datasets to the robotics community for benchmarking.
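The abstract describes an EKF-based scheme that fuses sensors running at significantly different sampling rates. As an illustration only, the following minimal sketch shows one common way such asynchronous fusion can be structured: predict at the fastest proprioceptive rate and correct whenever a slower exteroceptive measurement arrives. The class name, simplified planar state, and all noise parameters below are our assumptions, not the paper's implementation, which estimates full 6DOF orientation and position.

```python
import numpy as np

class AsyncEKF:
    """Minimal asynchronous-EKF sketch: predict at the fastest sensor
    rate and apply a correction whenever a slower sensor fires.
    State here is a simplified planar pose [x, y, yaw]."""

    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)  # state estimate
        self.P = np.asarray(P0, dtype=float)  # state covariance

    def predict(self, v, w, dt, Q):
        # Propagate with body velocity v and yaw rate w (e.g. from track
        # odometry and IMU) over a possibly variable time step dt.
        x, y, yaw = self.x
        self.x = np.array([x + v * np.cos(yaw) * dt,
                           y + v * np.sin(yaw) * dt,
                           yaw + w * dt])
        F = np.array([[1.0, 0.0, -v * np.sin(yaw) * dt],   # Jacobian of
                      [0.0, 1.0,  v * np.cos(yaw) * dt],   # the motion
                      [0.0, 0.0,  1.0]])                   # model
        self.P = F @ self.P @ F.T + Q

    def update(self, z, H, R):
        # Standard EKF correction, called for any exteroceptive
        # measurement (e.g. a camera- or laser-derived pose estimate).
        innov = z - H @ self.x                  # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ innov
        self.P = (np.eye(self.x.size) - K @ H) @ self.P

# Toy usage: 100 Hz proprioceptive prediction, 10 Hz exteroceptive update.
ekf = AsyncEKF(x0=[0.0, 0.0, 0.0], P0=np.eye(3) * 0.01)
Q = np.diag([1e-4, 1e-4, 1e-5])             # process noise (assumed)
H = np.eye(3)                                # full-pose measurement model
R = np.diag([0.05, 0.05, 0.01])              # measurement noise (assumed)
for step in range(100):
    ekf.predict(v=0.5, w=0.1, dt=0.01, Q=Q)
    if step % 10 == 0:                       # slower sensor fires here
        ekf.update(z=ekf.x + 0.01, H=H, R=R) # synthetic measurement
print(ekf.x)
```

Running the prediction at every proprioceptive sample and applying corrections as each slower measurement arrives is one standard way to reconcile heterogeneous sensor rates; the paper itself should be consulted for the actual fusion scheme.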
Domains
Robotics [cs.RO]
Origin: Files produced by the author(s)