A sensor fusion framework for wearable sensor suites
Abstract
We present a generalized mapping framework that can withstand the challenges of operating in unstructured outdoor environments, such as a snowy forest. The proposed method relies on a sensor fusion scheme in which sensors such as cameras and lidars are used to reconstruct the surrounding natural environment. Although mapping techniques such as SLAM and ICP cannot by themselves properly handle the complexity of natural scenes, they can contribute to the global solution within the proposed sensor fusion scheme, which is based on a factor graph architecture. In this paper, we propose a novel map registration scheme for visual maps and show how it improves reconstruction quality after data fusion. We also analyze the behavior and sensitivity of factor graphs to uncertainties by comparing the residual error obtained with different parameter combinations, such as variances, through an exhaustive grid search against ground truth. Finally, we propose an ICP-inferred loop closure capable of compensating for position and attitude drift. The experiments were carried out by recording data in a snowy forest with a wearable sensor suite, with ground truth acquired using a millimeter-accurate total station. The proposed framework is shown to be robust and capable of providing estimates that are otherwise unattainable with classic techniques such as visual SLAM and lidar-based ICP. A visible improvement in map reconstruction quality is demonstrated, and the proposed framework achieves a translation error of 0.36 m.
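To make the fusion architecture concrete, the sketch below illustrates the general pattern described above: sequential odometry factors (e.g. from visual SLAM) combined with an ICP-derived loop-closure factor in a pose factor graph. This is a minimal illustration, not the authors' implementation; the use of the GTSAM library, the five-pose trajectory, and all noise variances are assumptions chosen for the example.

```python
# Minimal sketch (not the paper's code) of factor-graph pose fusion with an
# ICP-inferred loop closure, using GTSAM. All numeric values are placeholders.
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()

# Diagonal noise models on SE(3): first 3 sigmas are rotation (rad),
# last 3 are translation (m). The variances here are illustrative, standing
# in for the parameter combinations explored via grid search in the paper.
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.01]*3 + [0.05]*3))
odom_noise  = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05]*3 + [0.20]*3))
loop_noise  = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02]*3 + [0.05]*3))

# Anchor the first pose at the origin.
graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), prior_noise))

# Sequential odometry factors (e.g. from a visual SLAM front end):
# here, 1 m forward steps along x.
num_poses = 5
for i in range(num_poses - 1):
    odom = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
    graph.add(gtsam.BetweenFactorPose3(i, i + 1, odom, odom_noise))

# Loop closure: the relative pose inferred by ICP between the last and first
# scans, which lets the optimizer compensate accumulated drift.
icp_rel = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(-(num_poses - 1.0), 0.0, 0.0))
graph.add(gtsam.BetweenFactorPose3(num_poses - 1, 0, icp_rel, loop_noise))

# Initial guess with deliberate drift, then joint optimization.
initial = gtsam.Values()
for i in range(num_poses):
    initial.insert(i, gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.1 * i, 0.1 * i, 0.0)))
result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
print(result.atPose3(num_poses - 1))  # drift pulled back toward the loop closure
```

Sweeping the sigma vectors above over a grid and comparing the resulting residual errors against ground truth mirrors the kind of sensitivity analysis of factor-graph variances that the abstract describes.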
Domains
Engineering Sciences [physics]