Continuous Pose Estimation for Urban Pedestrian Applications on a Hand-held Mobile Device
Abstract
To support pedestrian navigation in urban and indoor spaces, an accurate pose estimate (i.e. 3D position and 3D orientation) of a device held in hand is essential to the development of mobility assistance tools (e.g. Augmented Reality applications). On the assumption that the pedestrian carries only consumer-grade equipment, pose estimation is restricted to the low-cost sensors embedded in such devices (i.e. an Inertial and Magnetic Measurement Unit and a monocular camera). Moreover, urban and indoor spaces, with their closely-spaced buildings and ferromagnetic elements, are challenging environments for sensor pose estimation over large pedestrian displacements. However, the recent development and provision of 3D Geographical Information System content by cities constitutes a wealth of data usable for pose estimation. To address these challenges, we propose an autonomous sensor fusion framework for estimating the pose of a pedestrian's hand-held device in urban and indoor spaces. The proposed solution integrates inertial and magnetic attitude estimation, monocular Visual Odometry whose scale is recovered from pedestrian motion estimation, and absolute pose estimation based on the recognition of known 3D geospatial objects. First, this allows a qualified pose of the hand-held device to be estimated continuously. Second, each absolute pose estimate makes it possible to update the pose and improve positioning accuracy. To assess the proposed solution, experimental data was collected from four different people on a 0.5 km pedestrian walk through an urban space with sparse known objects and indoor passages. According to the performance evaluation, the sensor fusion process improved pedestrian localization in areas where conventional hand-held systems were inaccurate or unavailable.
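The abstract only names the fused components; the sketch below illustrates, under stated assumptions, how such a loosely-coupled fusion loop could be organized: scale-corrected monocular Visual Odometry increments provide dead reckoning, and occasional absolute pose fixes from recognized 3D geospatial objects correct the accumulated drift. All function and variable names are hypothetical, and a simple convex blend stands in for whatever filter the paper actually uses.

```python
# Minimal illustrative sketch (not the authors' implementation) of a
# loosely-coupled pose fusion loop: VO dead reckoning with a pedestrian
# motion-based metric scale, plus occasional absolute pose corrections.
import numpy as np


def fuse_step(position, R_world_device, vo_translation, vo_rotation,
              pedestrian_scale, absolute_pose=None, blend=0.8):
    """Advance the device pose by one Visual Odometry step.

    position        : (3,) current position in the world frame
    R_world_device  : (3, 3) current device orientation (rotation matrix)
    vo_translation  : (3,) up-to-scale VO translation in the device frame
    vo_rotation     : (3, 3) VO relative rotation between frames
    pedestrian_scale: metric scale factor estimated from pedestrian motion
    absolute_pose   : optional (position, R) fix from 3D object recognition
    blend           : weight given to the absolute fix when available
    """
    # Dead reckoning: apply the metrically scaled VO increment.
    position = position + R_world_device @ (pedestrian_scale * vo_translation)
    R_world_device = R_world_device @ vo_rotation

    # Absolute update: pull the drifting estimate toward the recognized
    # object's pose (a convex blend stands in for a proper filter update).
    if absolute_pose is not None:
        abs_position, abs_R = absolute_pose
        position = (1.0 - blend) * position + blend * abs_position
        R_world_device = abs_R  # trust the absolute orientation fix

    return position, R_world_device


# Example: one VO step of ~0.7 m forward, then an absolute fix.
p, R = np.zeros(3), np.eye(3)
p, R = fuse_step(p, R, vo_translation=np.array([0.0, 0.0, 1.0]),
                 vo_rotation=np.eye(3), pedestrian_scale=0.7)
p, R = fuse_step(p, R, vo_translation=np.zeros(3), vo_rotation=np.eye(3),
                 pedestrian_scale=0.7,
                 absolute_pose=(np.array([0.0, 0.0, 0.75]), np.eye(3)))
print(p)  # drift-corrected position
```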