Preprints, Working Papers, ... Year: 2022

PAVI: Plate-Amortized Variational Inference

Abstract

Given some observed data and a probabilistic generative model, Bayesian inference aims at obtaining the distribution of a model's latent parameters that could have yielded the data. This task is challenging for large population studies, where thousands of measurements are performed over a cohort of hundreds of subjects, resulting in a massive latent parameter space. This large cardinality renders off-the-shelf Variational Inference (VI) computationally impractical. In this work, we design structured VI families that can efficiently tackle large population studies. Our main idea is to share the parameterization and learning across the different i.i.d. variables in a generative model, symbolized by the model's plates. We name this concept plate amortization and illustrate the powerful synergies it entails, resulting in expressive, parsimoniously parameterized, large-scale hierarchical variational distributions that are orders of magnitude faster to train. We illustrate the practical utility of PAVI through a challenging neuroimaging example featuring a million latent parameters, demonstrating a significant step towards scalable and expressive Variational Inference.
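
To make the plate-amortization idea in the abstract more concrete, the sketch below (not the authors' implementation) contrasts, in PyTorch, per-subject variational parameters with a single encoder whose weights are shared across the subject plate, so the variational parameter count no longer grows with the number of subjects. All names, layer sizes, and the Gaussian posterior choice are illustrative assumptions.

    # Minimal sketch of plate amortization, assuming a Gaussian per-subject
    # posterior; names and sizes are illustrative, not from the paper.
    import torch
    import torch.nn as nn

    N_SUBJECTS, N_OBS = 100, 50             # plate cardinalities (illustrative)
    data = torch.randn(N_SUBJECTS, N_OBS)   # synthetic observations

    # Non-amortized VI: one (mu, log_sigma) pair per subject -> O(N) parameters.
    per_subject_params = nn.Parameter(torch.zeros(N_SUBJECTS, 2))

    # Plate-amortized VI: one small network shared across the subject plate;
    # its parameter count is independent of the number of subjects.
    shared_encoder = nn.Sequential(
        nn.Linear(N_OBS, 32),
        nn.ReLU(),
        nn.Linear(32, 2),                   # outputs (mu, log_sigma) per subject
    )

    def amortized_posterior(x):
        """Map each subject's observations to its Gaussian posterior parameters."""
        out = shared_encoder(x)
        mu, log_sigma = out[:, 0], out[:, 1]
        return mu, log_sigma.exp()

    mu, sigma = amortized_posterior(data)
    print(mu.shape, sigma.shape)            # torch.Size([100]) torch.Size([100])
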
Main file: PAVI_Neurips2022_preprint.pdf (4.22 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03684389, version 1 (08-06-2022)

Identifiers

Cite

Louis Rouillard, Thomas Moreau, Demian Wassermann. PAVI: Plate-Amortized Variational Inference. 2022. ⟨hal-03684389⟩