Journal article in Physical Review E, 2023

Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors

Abstract

In many applications in biology, engineering, and economics, identifying similarities and differences between distributions of data from complex processes requires comparing finite categorical samples of discrete counts. Statistical divergences quantify the difference between two distributions, but estimating them from data is difficult, and empirical plug-in methods often fail, especially when the samples are small. We develop a Bayesian estimator of the Kullback-Leibler divergence between two probability distributions that uses a mixture of Dirichlet priors on the distributions being compared. We study the properties of the estimator on two examples: probabilities drawn from Dirichlet distributions, and random strings of letters drawn from Markov chains. We extend the approach to the squared Hellinger divergence. Both estimators outperform other estimation techniques, with the largest gains for data with many categories and for large divergences.
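To make the idea concrete, the sketch below (not the authors' implementation) shows posterior-mean estimation of the Kullback-Leibler and squared Hellinger divergences under a single symmetric Dirichlet prior with fixed pseudocounts a and b; the paper's estimator instead averages over a mixture of Dirichlet priors, effectively integrating over the concentration parameter as well. The closed forms rely on standard Dirichlet moments, E[p_i log p_i] = (alpha_i/A)(psi(alpha_i+1) - psi(A+1)) and E[sqrt(p_i)] = Gamma(alpha_i+1/2)Gamma(A) / (Gamma(alpha_i)Gamma(A+1/2)), with alpha_i = n_i + a and A = sum_i alpha_i.

    import numpy as np
    from scipy.special import digamma, gammaln

    def kl_posterior_mean(n, m, a=1.0, b=1.0):
        # Posterior-mean estimate of D_KL(p || q) from count vectors n, m
        # over the same categories, under independent symmetric Dirichlet
        # priors Dir(a) on p and Dir(b) on q. This is a single-prior
        # simplification of the paper's mixture-of-Dirichlets estimator.
        alpha = np.asarray(n, dtype=float) + a   # posterior parameters for p
        beta = np.asarray(m, dtype=float) + b    # posterior parameters for q
        A, B = alpha.sum(), beta.sum()
        # E[sum_i p_i log p_i] - sum_i E[p_i] E[log q_i],
        # using the posterior independence of p and q
        return float(np.sum(alpha / A * (digamma(alpha + 1.0) - digamma(A + 1.0)
                                         - digamma(beta) + digamma(B))))

    def hellinger2_posterior_mean(n, m, a=1.0, b=1.0):
        # Posterior-mean estimate of the squared Hellinger divergence
        # H^2(p, q) = 1 - sum_i sqrt(p_i q_i), same priors as above.
        alpha = np.asarray(n, dtype=float) + a
        beta = np.asarray(m, dtype=float) + b
        A, B = alpha.sum(), beta.sum()
        # E[sqrt(p_i)] computed through log-gamma for numerical stability
        ep = np.exp(gammaln(alpha + 0.5) - gammaln(alpha)
                    + gammaln(A) - gammaln(A + 0.5))
        eq = np.exp(gammaln(beta + 0.5) - gammaln(beta)
                    + gammaln(B) - gammaln(B + 0.5))
        return float(1.0 - np.sum(ep * eq))

    # Example: two multinomial samples of size 50 over 5 categories
    rng = np.random.default_rng(0)
    p, q = rng.dirichlet(np.ones(5)), rng.dirichlet(np.ones(5))
    n, m = rng.multinomial(50, p), rng.multinomial(50, q)
    print(kl_posterior_mean(n, m), hellinger2_posterior_mean(n, m))

Unlike the empirical plug-in estimate, which diverges whenever a category is observed in the first sample but absent from the second, the prior pseudocounts keep every posterior probability strictly positive, which is why Bayesian smoothing of this kind is attractive for small samples.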

Dates and versions

hal-04252253, version 1 (20-10-2023)

Identifiers

HAL Id: hal-04252253
DOI: 10.1103/PhysRevE.109.024305

Cite

Francesco Camaglia, Ilya Nemenman, Thierry Mora, Aleksandra Walczak. Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors. Physical Review E, 2023, 109 (2), pp.024305. ⟨10.1103/PhysRevE.109.024305⟩. ⟨hal-04252253⟩