Active High Dynamic Range Mapping for Dense Visual SLAM
Abstract
Acquiring High Dynamic Range (HDR) photographs from several images, with an active shutter providing different exposures (sensor integration periods), has been widely commercialised in photography for static camera positions. In the case of a moving video sensor (as in robotics), this problem is more difficult because the real-time motion of the sensor changes the perspective between the acquired images. HDR approaches for sets of images taken from different perspectives have therefore been largely overlooked, since they would require sophisticated dense mapping approaches to eliminate the motion component. Recent dense visual SLAM (Simultaneous Localization And Mapping) approaches provide this framework; however, few works have attempted to perform HDR visual SLAM, and current approaches remain highly dependent on illumination conditions and camera shutter settings. In this paper a new approach is proposed that enables 3D HDR environment maps to be acquired actively from a dynamic set of images in real-time. The 6 DOF pose, the dense scene structure and the HDR texture map are estimated simultaneously with the objective of maximising the dynamic range. This makes it possible to obtain a radiance map of the scene by fusing a real-time stream of low dynamic range (LDR) images into a graph of HDR key-frame images. In particular, a method is proposed to actively control the shutter, based on information theory, so as to optimise the information content of the 3D HDR environment map for RGB-D sensors. As shown in the results, a 3D HDR environment map allows robot localisation and mapping (visual SLAM) to actively take advantage of varying luminosity in different parts of the scene.
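To make the two ideas summarised above concrete, the sketch below illustrates, in Python, (i) irradiance-weighted fusion of registered LDR frames into an HDR key-frame radiance map and (ii) a simple entropy-based choice of the next shutter time. This is a minimal illustration under simplifying assumptions (linear or pre-calibrated camera response, frames already warped into the key-frame viewpoint by the dense SLAM pose/depth estimates, a hat weighting function, and per-image Shannon entropy as the information criterion); the function names and the exact criterion are illustrative and not the paper's formulation.

```python
import numpy as np

def hat_weight(z, z_min=0.05, z_max=0.95):
    """Trust mid-range intensities; down-weight under/over-exposed pixels."""
    w = np.minimum(z - z_min, z_max - z) / (0.5 * (z_max - z_min))
    return np.clip(w, 0.0, 1.0)

def fuse_ldr_into_radiance(ldr_images, exposure_times, response_inv=None):
    """Fuse registered LDR frames (values in [0, 1]) with known exposure
    times into a per-pixel radiance (irradiance) estimate.

    Assumes the frames have already been warped into the key-frame's
    viewpoint, e.g. using the dense SLAM pose and depth estimates.
    """
    if response_inv is None:
        response_inv = lambda z: z          # assume a linear camera response
    num = np.zeros_like(ldr_images[0], dtype=np.float64)
    den = np.zeros_like(ldr_images[0], dtype=np.float64)
    for z, t in zip(ldr_images, exposure_times):
        w = hat_weight(z)
        num += w * response_inv(z) / t      # per-pixel irradiance estimate
        den += w
    return num / np.maximum(den, 1e-8)

def pick_next_exposure(radiance, candidate_times, n_bins=256):
    """Information-theoretic shutter selection (simplified): choose the
    candidate exposure whose simulated LDR image has maximum entropy."""
    best_t, best_h = None, -np.inf
    for t in candidate_times:
        z = np.clip(radiance * t, 0.0, 1.0)            # simulate capture at time t
        hist, _ = np.histogram(z, bins=n_bins, range=(0.0, 1.0))
        p = hist / hist.sum()
        h = -np.sum(p[p > 0] * np.log2(p[p > 0]))      # Shannon entropy in bits
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Toy usage with synthetic data (hypothetical values for illustration only)
rng = np.random.default_rng(0)
scene = rng.uniform(0.01, 10.0, size=(120, 160))       # ground-truth radiance
times = [1 / 500, 1 / 60, 1 / 8]
frames = [np.clip(scene * t, 0.0, 1.0) for t in times]
hdr = fuse_ldr_into_radiance(frames, times)
next_t = pick_next_exposure(hdr, candidate_times=[1 / 1000, 1 / 250, 1 / 30, 1 / 4])
```

In this sketch, the hat weighting discards saturated and under-exposed observations so each key-frame pixel is dominated by its best-exposed measurements, while the entropy criterion favours the shutter time that spreads the simulated intensities most evenly across the available quantisation levels.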