Report (Research report), 2022

Regularized Rényi divergence minimization through Bregman proximal gradient algorithms

Abstract

We study the variational inference problem of minimizing a regularized Rényi divergence over an exponential family, and propose a relaxed moment-matching algorithm, which includes a proximal-like step. Using the information-geometric link between Bregman divergences and the Kullback-Leibler divergence, this algorithm is shown to be equivalent to a Bregman proximal gradient algorithm. This novel perspective allows us to exploit the geometry of our approximate model while using stochastic black-box updates. We use this point of view to prove strong convergence guarantees including monotonic decrease of the objective, convergence to a stationary point or to the minimizer, and convergence rates. These new theoretical insights lead to a versatile, robust, and competitive method, as illustrated by numerical experiments.
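As a brief, informal illustration of the Bregman proximal gradient viewpoint described in the abstract (the notation below is ours, not taken from the report): for an exponential family with natural parameter \theta and log-partition function \Phi, the Bregman divergence generated by \Phi recovers a Kullback-Leibler divergence between members of the family, and a generic Bregman proximal gradient step on a smooth term f plus a regularizer g, with step size \gamma, reads

D_\Phi(\theta, \theta') = \Phi(\theta) - \Phi(\theta') - \langle \nabla \Phi(\theta'), \theta - \theta' \rangle = \mathrm{KL}\big(q_{\theta'} \,\|\, q_{\theta}\big),

\theta_{k+1} \in \operatorname*{arg\,min}_{\theta} \; \gamma \big( \langle \nabla f(\theta_k), \theta \rangle + g(\theta) \big) + D_\Phi(\theta, \theta_k).

With this choice of \Phi, each step penalizes the KL deviation from the current approximation, which is how the relaxed moment-matching interpretation arises. The specific smooth term (a regularized Rényi objective), the stochastic black-box gradient estimators, and the conditions under which the convergence guarantees hold are detailed in the report's PDF.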
Main file

2211.04776.pdf (1.77 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03927834, version 1 (06-01-2023)

Identifiers

  • HAL Id: hal-03927834, version 1

Cite

Thomas Guilmeau, Emilie Chouzenoux, Víctor Elvira. Regularized Rényi divergence minimization through Bregman proximal gradient algorithms. Inria Saclay - Île de France. 2022. ⟨hal-03927834⟩
