Data Fusion Performance Evaluation for Range Measurements Combined with Cartesian Ones for Road Obstacle Tracking
Abstract
This paper deals with the evaluation of centralized fusion of two dissimilar sensors for the purpose of road obstacle tracking. The aim of sensor fusion is to produce an improved state estimate of a system from a set of independent data sources. Indeed, for robust perception of the environment, seen here as obstacles, several sensors should be installed in the equipped vehicle: camera, lidar, radar, etc. In our case, the motivation for this work comes from the need to track road targets with lidar measurements combined with radar ones. The aim is thus to combine effectively radar range measurements (i.e. range and range rate) with lidar Cartesian measurements in a "turn" scenario. Centralized fusion, i.e. measurement fusion, of two dissimilar sensors is considered here for evaluation. The evaluation is based on the Cramer-Rao Lower Bound (CRLB), the basic tool for investigating estimation performance, as it represents a limit on the achievable accuracy of state estimation. In the target tracking area, a recursive formulation of the Posterior Cramer-Rao Lower Bound (PCRLB) is used to analyze performance. Several bound comparisons are made according to the scenarios used and the various sensor configurations. Moreover, two algorithms for target motion analysis are developed and compared to the theoretical performance bounds: the extended Kalman filter and the particle filter.
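For reference, the recursive PCRLB formulation commonly used in target tracking (following Tichavský et al.) can be sketched as below; the symbols (state $x_k$, stacked measurement $z_{k+1}$, information matrix $J_k$) are assumptions for illustration, and the recursion is assumed to match the formulation evaluated in the paper.

% Standard recursive PCRLB sketch; assumed, not reproduced from the paper.
\begin{align}
  \mathbb{E}\!\left[(\hat{x}_k - x_k)(\hat{x}_k - x_k)^{\mathsf{T}}\right] &\succeq J_k^{-1}, \\
  J_{k+1} &= D_k^{22} - D_k^{21}\left(J_k + D_k^{11}\right)^{-1} D_k^{12},
\end{align}
where
\begin{align}
  D_k^{11} &= \mathbb{E}\!\left[-\Delta_{x_k}^{x_k} \log p(x_{k+1}\mid x_k)\right], \\
  D_k^{12} &= \mathbb{E}\!\left[-\Delta_{x_k}^{x_{k+1}} \log p(x_{k+1}\mid x_k)\right] = \left(D_k^{21}\right)^{\mathsf{T}}, \\
  D_k^{22} &= \mathbb{E}\!\left[-\Delta_{x_{k+1}}^{x_{k+1}} \log p(x_{k+1}\mid x_k)\right]
            + \mathbb{E}\!\left[-\Delta_{x_{k+1}}^{x_{k+1}} \log p(z_{k+1}\mid x_{k+1})\right].
\end{align}

Under centralized (measurement) fusion with independent sensor noises, the measurement term in $D_k^{22}$ is simply the sum of the radar and lidar contributions, which is what makes the bound convenient for comparing sensor configurations.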