Comix: Joint Estimation and Lightspeed Comparison of Mixture Models

Abstract: The Kullback-Leibler divergence is a widespread dissimilarity measure between probability density functions, based on the Shannon entropy. Unfortunately, no analytic formula is available to compute this divergence between mixture models, which imposes the use of costly approximation algorithms. To reduce the computational burden when many divergence evaluations are needed, we introduce a sub-class of mixture models in which the component parameters are shared across a set of mixtures, the only degree of freedom of each mixture being its vector of weights. This sharing makes it possible to design extremely fast versions of existing dissimilarity measures between mixtures. We demonstrate the effectiveness of our approach by evaluating the quality of the ordering produced by our method on a real dataset.
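To make the idea concrete, here is a minimal sketch (an assumption on our part, not the measure proposed in the paper): once all mixtures share the same components, each mixture reduces to its weight vector, and a weight-level divergence, here the discrete Kullback-Leibler divergence between weight vectors, which upper-bounds the KL between the mixtures by the log-sum inequality, can serve as a very cheap dissimilarity.

import numpy as np

def kl_weights(w1, w2, eps=1e-12):
    """Discrete KL divergence between two mixture weight vectors.

    Illustrative proxy only: when both mixtures are built on the same shared
    components, the log-sum inequality gives
        KL(sum_i w1_i f_i || sum_i w2_i f_i) <= KL(w1 || w2),
    so comparing weight vectors costs O(k) instead of running a costly
    approximation of the mixture-level divergence.
    """
    w1 = np.asarray(w1, dtype=float) + eps
    w2 = np.asarray(w2, dtype=float) + eps
    w1, w2 = w1 / w1.sum(), w2 / w2.sum()
    return float(np.sum(w1 * np.log(w1 / w2)))

# Two mixtures sharing the same 4 components, differing only in their weights.
print(kl_weights([0.4, 0.3, 0.2, 0.1], [0.1, 0.2, 0.3, 0.4]))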
Document type:
Conference paper
ICASSP 2016, Shanghai, China, 2016. DOI: 10.1109/ICASSP.2016.7472117

https://hal.sorbonne-universite.fr/hal-01367923
Contributor: Olivier Schwander
Submitted on: Saturday, September 17, 2016 - 11:59:01
Last modified on: Thursday, February 7, 2019 - 14:24:40
Document(s) archived on: Sunday, December 18, 2016 - 15:44:40

File

icassp2016.pdf
Files produced by the author(s)

Citation

Olivier Schwander, Stéphane Marchand-Maillet, Frank Nielsen. Comix: Joint Estimation and Lightspeed Comparison of Mixture Models. ICASSP 2016, Shanghai, China, 2016. DOI: 10.1109/ICASSP.2016.7472117. HAL: hal-01367923.
