Conference paper, year: 2016

Comix: Joint Estimation and Lightspeed Comparison of Mixture Models

Abstract

The Kullback-Leibler divergence is a widespread dissimilarity measure between probability density functions, based on the Shannon entropy. Unfortunately, no analytic formula is available to compute this divergence between mixture models, which imposes the use of costly approximation algorithms. To reduce the computational burden when many divergence evaluations are needed, we introduce a sub-class of mixture models in which the component parameters are shared across a set of mixtures, the only remaining degree of freedom being the weight vector of each mixture. This sharing allows us to design extremely fast versions of existing dissimilarity measures between mixtures. We demonstrate the effectiveness of our approach by evaluating the quality of the ordering produced by our method on a real dataset.
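The paper is the authoritative reference; what follows is only a minimal Python sketch of the idea the abstract describes, under stated assumptions: components are taken to be univariate Gaussians, a Monte Carlo estimator stands in for the "costly approximation algorithms" mentioned above, and the fast comparison uses the fact (by the log-sum inequality) that for two mixtures sharing the same components, the discrete Kullback-Leibler divergence between their weight vectors upper-bounds the divergence between the mixtures at O(K) cost. The function names mc_kl and weight_kl are illustrative, not the paper's API.

```python
import numpy as np
from scipy.stats import norm

def mc_kl(weights_p, weights_q, means, stds, n_samples=10_000, seed=None):
    """Costly baseline: Monte Carlo estimate of KL(p || q) between two
    Gaussian mixtures, which has no closed form in general."""
    rng = np.random.default_rng(seed)
    # Sample from mixture p: pick a component, then sample from it.
    comp = rng.choice(len(weights_p), size=n_samples, p=weights_p)
    x = rng.normal(means[comp], stds[comp])
    # Evaluate both mixture densities at the sampled points.
    dens = norm.pdf(x[:, None], loc=means, scale=stds)  # shape (n, K)
    p = dens @ np.asarray(weights_p)
    q = dens @ np.asarray(weights_q)
    return np.mean(np.log(p) - np.log(q))

def weight_kl(weights_p, weights_q, eps=1e-12):
    """Lightspeed proxy: when both mixtures share the same components,
    the discrete KL between weight vectors upper-bounds the mixture KL
    (log-sum inequality) and costs only O(K)."""
    w, v = np.asarray(weights_p), np.asarray(weights_q)
    return float(np.sum(w * (np.log(w + eps) - np.log(v + eps))))

# Shared components; only the weights differ between the two mixtures.
means = np.array([-2.0, 0.0, 3.0])
stds = np.array([0.5, 1.0, 0.8])
wp = np.array([0.6, 0.3, 0.1])
wq = np.array([0.2, 0.5, 0.3])

print(mc_kl(wp, wq, means, stds, seed=0))  # slow, stochastic estimate
print(weight_kl(wp, wq))                   # fast O(K) upper bound
```

On this toy example the fast bound needs only a handful of arithmetic operations, while the Monte Carlo estimate requires thousands of density evaluations per comparison; this is the kind of speed-up the shared-component restriction is designed to buy when many pairwise divergence evaluations are needed.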
Main file: icassp2016.pdf (213.48 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01367923, version 1 (17-09-2016)

Identifiers

hal-01367923
DOI: 10.1109/ICASSP.2016.7472117

Cite

Olivier Schwander, Stéphane Marchand-Maillet, Frank Nielsen. Comix: Joint Estimation and Lightspeed Comparison of Mixture Models. ICASSP 2016, 2016, Shanghai, China. ⟨10.1109/ICASSP.2016.7472117⟩. ⟨hal-01367923⟩