Mixture weights optimisation for alpha-divergence variational inference - Archive ouverte HAL
Conference paper Year: 2021

Mixture weights optimisation for alpha-divergence variational inference

Abstract

This paper focuses on $\alpha$-divergence minimisation methods for Variational Inference. We consider the case where the posterior density is approximated by a mixture model and we investigate algorithms optimising the mixture weights of this mixture model by $\alpha$-divergence minimisation, without any information on the underlying distribution of its mixture components parameters. The Power Descent, defined for all $\alpha \neq 1$, is one such algorithm and we establish in our work the full proof of its convergence towards the optimal mixture weights when $\alpha < 1$. Since the $\alpha$-divergence recovers the widely-used exclusive Kullback-Leibler when $\alpha \to 1$, we then extend the Power Descent to the case $\alpha = 1$ and show that we obtain an Entropic Mirror Descent. This leads us to investigate the link between Power Descent and Entropic Mirror Descent: first-order approximations allow us to introduce the Rényi Descent, a novel algorithm for which we prove an $O(1/N)$ convergence rate. Lastly, we compare numerically the behavior of the unbiased Power Descent and of the biased Rényi Descent and we discuss the potential advantages of one algorithm over the other.
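As a rough illustration of the setting described above (not the paper's exact Power Descent or Rényi Descent updates), the sketch below optimises the weights of a mixture with fixed Gaussian components by minimising a Monte Carlo estimate of the $\alpha$-divergence to an unnormalised target, using a plain exponentiated-gradient (entropic mirror descent) step on the simplex. The target log_p, the component parameters, and the choices alpha = 0.5 and eta = 0.3 are illustrative assumptions, not quantities from the paper.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative, unnormalised target density p(x): a bimodal Gaussian shape (assumption).
def log_p(x):
    return np.logaddexp(norm.logpdf(x, -2.0, 0.7), norm.logpdf(x, 2.5, 0.5))

# Fixed mixture components k(theta_j, .): Gaussians with fixed parameters (assumption);
# only the mixture weights lam are optimised, as in the setting considered in the paper.
means = np.array([-3.0, -1.5, 0.0, 1.5, 3.0])
scales = np.full(means.shape, 1.0)
J = means.size

def log_q(x, lam):
    """log q_lam(x) = log sum_j lam_j k(theta_j, x)."""
    comp = norm.logpdf(x[:, None], means[None, :], scales[None, :])
    return logsumexp(comp + np.log(lam)[None, :], axis=1)

def grad_alpha(lam, alpha, n=2000):
    """Monte Carlo gradient of the alpha-divergence objective w.r.t. lam.

    With f_alpha(u) = (u**alpha - 1) / (alpha * (alpha - 1)), the j-th partial
    derivative is E_{x ~ k(theta_j, .)}[(q_lam(x) / p(x))**(alpha - 1)] / (alpha - 1).
    """
    g = np.empty(J)
    for j in range(J):
        x = rng.normal(means[j], scales[j], size=n)
        log_ratio = log_q(x, lam) - log_p(x)
        g[j] = np.exp((alpha - 1.0) * log_ratio).mean() / (alpha - 1.0)
    return g

# Exponentiated-gradient (entropic mirror descent) update of the weights on the simplex.
alpha, eta = 0.5, 0.3          # alpha < 1 and step size: illustrative choices
lam = np.full(J, 1.0 / J)      # uniform initial weights
for t in range(200):
    lam = lam * np.exp(-eta * grad_alpha(lam, alpha))
    lam = lam / lam.sum()

print(np.round(lam, 3))  # weights concentrate on components near the target's modes
```

The Power Descent and Rényi Descent studied in the paper replace this generic mirror-descent step with updates tailored to the $\alpha$-divergence; the sketch only shows the weights-on-the-simplex structure they operate on.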
Main file
NeurIPS-2021-mixture-weights-optimisation-for-alpha-divergence-variational-inference-Paper.pdf (602.75 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-04083262, version 1 (27-04-2023)

Cite

Kamélia Daudel, Randal Douc. Mixture weights optimisation for alpha-divergence variational inference. Advances in Neural Information Processing Systems (NeurIPS), Dec 2021, Online, France. pp.4397--4408, ⟨10.48550/arXiv.2106.05114⟩. ⟨hal-04083262⟩
