Testing conditional independence to determine shared information in a data/signal fusion process
Abstract
In this paper, we introduce hypothesis testing (HT) to validate the conditional independence hypothesis between sets of data, these data being modelled as continuous random variables. This HT is based on a pdf relationship and does not require any normality assumption for the data, for instance. In practice, the HT measures the entropic distance between products of probability densities. The statistics of the entropy estimates, in particular bias, variance and covariance, are extensively discussed in order to normalize the proposed statistical index. The results are discussed for three data and two signal sets, with Gaussian or non-Gaussian statistics. Our HT is also compared to the usual HT used to validate the conditional independence hypothesis.