Journal article in Transactions on Machine Learning Research Journal, 2023

PAVI: Plate-Amortized Variational Inference

Abstract

Given observed data and a probabilistic generative model, Bayesian inference aims to recover the distribution over the model's latent parameters that could have yielded the data. This task is challenging for large population studies, where thousands of measurements are performed over a cohort of hundreds of subjects, resulting in a massive latent parameter space. This large cardinality renders off-the-shelf Variational Inference (VI) computationally impractical. In this work, we design structured VI families that can efficiently tackle large population studies. To this end, our main idea is to share the parameterization and learning across the different i.i.d. variables in a generative model, symbolized by the model's plates. We name this concept plate amortization and illustrate the powerful synergies it enables, resulting in large-scale hierarchical variational distributions that are expressive, parsimoniously parameterized, and orders of magnitude faster to train. We illustrate the practical utility of PAVI through a challenging neuroimaging example featuring a million latent parameters, demonstrating a significant step towards scalable and expressive Variational Inference.
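To make the plate-amortization idea concrete, below is a minimal sketch in PyTorch. It is not the authors' implementation: the toy model, dimensions, and names (encoder, subject_data, local_posterior) are illustrative assumptions. It only shows the core intuition from the abstract: one set of weights is shared across every i.i.d. variable in a subject plate, so the number of variational parameters does not grow with the cohort size.

```python
# Hedged sketch of plate amortization; NOT the PAVI implementation.
# All names and shapes below are hypothetical and for illustration only.
import torch
import torch.nn as nn
from torch.distributions import Normal

n_subjects, n_obs, latent_dim = 200, 50, 8      # illustrative plate cardinalities
subject_data = torch.randn(n_subjects, n_obs)   # stand-in for observed measurements

# Non-amortized VI would learn a separate mean and log-std per subject,
# i.e. a parameter count growing linearly with the subject plate.
# Plate amortization instead shares ONE encoder across all plate elements:
encoder = nn.Sequential(
    nn.Linear(n_obs, 64), nn.ReLU(),
    nn.Linear(64, 2 * latent_dim),  # outputs per-subject mean and log-std
)

def local_posterior(x):
    """Map a subject's data to its local variational posterior q(theta_s | x_s)."""
    mean, log_std = encoder(x).chunk(2, dim=-1)
    return Normal(mean, log_std.exp())

q = local_posterior(subject_data)  # batched over the whole subject plate
samples = q.rsample()              # shape (n_subjects, latent_dim)
```

Under this sketch, training updates only the shared encoder weights, which is what makes the variational family parsimoniously parameterized as the population grows.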
Main file: PAVI_Neurips2022_preprint.pdf (4.22 MB)
Origin: files produced by the author(s)

Dates and versions

hal-03684389 , version 1 (08-06-2022)
hal-03684389 , version 2 (09-01-2024)

Cite

Louis Rouillard, Alexandre Le Bris, Thomas Moreau, Demian Wassermann. PAVI: Plate-Amortized Variational Inference. Transactions on Machine Learning Research Journal, 2023. ⟨hal-03684389v1⟩

Collections

GS-ENGINEERING
