Pedestrian Track Estimation with Handheld Monocular Camera and Inertial-Magnetic Sensor for Urban Augmented Reality
Abstract
The urban environment constitutes a challenging area for pedestrian navigation. However, with the recent increase in pedestrians carrying devices such as smartphones, complementary data provided by integrated low-cost sensors (camera, IMMU and GNSS receiver) may be used in a coupling process to accurately estimate the pose (i.e. 3D position and 3D orientation) of a handheld device. Additionally, the current development and availability of 3D GIS content constitutes a rich source of data usable for camera pose estimation. In the context of pedestrian navigation in urban environments, we propose to fuse the pose estimated through a vision process, based on a precisely known 3D model, with inertial and magnetic measurements, in order to update a Pedestrian Dead-Reckoning process and improve positioning accuracy. Experimental data collected in an urban environment, along a long pedestrian path with sparsely available known models, validate the benefit of the sensor fusion process. The fusion results in improved positioning accuracy that enhances the Pedestrian Dead-Reckoning process and enables the display of 3D information in Augmented Reality. Performance is presented in terms of positioning accuracy and compared to commonly used solutions.
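The abstract describes the fusion at a high level without prescribing a specific algorithm here; the following is a minimal illustrative sketch, assuming a loosely coupled scheme in which a 2D Pedestrian Dead-Reckoning estimate propagated from step detections is corrected by occasional absolute pose fixes from the vision process. All function names, parameters (e.g. the blending gain), and numbers are hypothetical and only convey the general idea.

```python
import numpy as np

def pdr_step(position, heading, step_length, delta_heading):
    """Propagate a 2D Pedestrian Dead-Reckoning estimate by one detected step."""
    heading = heading + delta_heading
    position = position + step_length * np.array([np.cos(heading), np.sin(heading)])
    return position, heading

def fuse_vision_fix(position, heading, vision_position, vision_heading, gain=0.8):
    """Loosely coupled correction: blend the PDR estimate toward an absolute
    camera pose obtained by matching the image against a known 3D model."""
    position = (1.0 - gain) * position + gain * np.asarray(vision_position)
    # Wrap the heading residual to (-pi, pi] before applying the correction.
    residual = np.arctan2(np.sin(vision_heading - heading),
                          np.cos(vision_heading - heading))
    heading = heading + gain * residual
    return position, heading

# Illustrative usage: steps detected from the IMMU, with a vision-based
# pose fix available only when a known 3D model is in view.
pos, yaw = np.zeros(2), 0.0
for step in range(100):
    pos, yaw = pdr_step(pos, yaw, step_length=0.7, delta_heading=0.01)
    if step % 25 == 0:
        pos, yaw = fuse_vision_fix(pos, yaw,
                                   vision_position=pos + 0.5,
                                   vision_heading=yaw + 0.02)
```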