Conference paper · Year: 2023

Optimizing Markov Chain Monte Carlo Convergence with Normalizing Flows and Gibbs Sampling

Abstract

Generative models have started to integrate into the scientific computing toolkit. One notable instance of this integration is the use of normalizing flows (NF) in the development of sampling and variational inference algorithms. This work introduces a novel algorithm, GflowMC, which relies on a Metropolis-within-Gibbs framework within the latent space of NFs. This approach addresses the challenge of vanishing acceptance probabilities often encountered when using NF-generated independent proposals, while retaining non-local updates, which makes it well suited to sampling multi-modal distributions. We assess GflowMC's performance focusing on the ϕ⁴ model from statistical mechanics. Our results demonstrate that, by identifying an optimal size for partial updates, convergence of the Markov Chain Monte Carlo (MCMC) can be achieved faster than with full updates. Additionally, we explore the adaptability of GflowMC for biasing proposals towards increasing the update frequency of critical coordinates, such as coordinates highly correlated with mode switching in multi-modal targets.
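To make the idea of Metropolis-within-Gibbs updates in the latent space of a normalizing flow concrete, the following is a minimal illustrative sketch, not the authors' implementation. It assumes a toy element-wise affine map standing in for a trained NF, a simple bimodal target, and hypothetical names (flow_forward, flow_log_det, log_pullback, gibbs_block_step). A block of latent coordinates is redrawn from the standard-normal base and accepted or rejected against the target pulled back through the flow.

```python
import numpy as np

# Toy stand-in for a trained normalizing flow: an element-wise affine map
# x = MU + SIGMA * z. In practice a trained NF (e.g. a coupling flow) would
# supply flow_forward and flow_log_det; the Gibbs logic below is unchanged.
MU = np.array([0.0, 0.0, 0.0, 0.0])
SIGMA = np.array([2.0, 1.0, 1.0, 1.0])

def flow_forward(z):
    return MU + SIGMA * z

def flow_log_det(z):
    # log |det df/dz|; constant for the affine toy flow
    return np.sum(np.log(SIGMA))

def log_target(x):
    # Illustrative bimodal target: mixture of two Gaussians along the first coordinate
    a = -0.5 * np.sum((x - np.array([-2.0, 0.0, 0.0, 0.0])) ** 2)
    b = -0.5 * np.sum((x - np.array([2.0, 0.0, 0.0, 0.0])) ** 2)
    return np.logaddexp(a, b)

def log_pullback(z):
    # Target density pulled back to latent space: log p(f(z)) + log |det J_f(z)|
    return log_target(flow_forward(z)) + flow_log_det(z)

def gibbs_block_step(z, block, rng):
    """Metropolis-within-Gibbs update: redraw the latent coordinates in `block`
    from the standard-normal base, accept/reject against the pullback target."""
    z_prop = z.copy()
    z_prop[block] = rng.standard_normal(len(block))
    # Independence proposal from the base: the MH ratio keeps the base density
    # of the old and new block values (untouched coordinates cancel).
    log_alpha = (log_pullback(z_prop) - log_pullback(z)
                 + 0.5 * np.sum(z_prop[block] ** 2) - 0.5 * np.sum(z[block] ** 2))
    if np.log(rng.uniform()) < log_alpha:
        return z_prop, True
    return z, False

rng = np.random.default_rng(0)
z = rng.standard_normal(4)
blocks = [np.array([0, 1]), np.array([2, 3])]  # partial updates of size 2
n_steps, accepted = 5000, 0
for t in range(n_steps):
    z, acc = gibbs_block_step(z, blocks[t % len(blocks)], rng)
    accepted += acc
print(f"acceptance rate: {accepted / n_steps:.2f}")
```

In this sketch the block size plays the role discussed in the abstract: smaller blocks raise the acceptance probability at the cost of more local moves, while full-latent updates recover the plain NF independence proposal, whose acceptance can vanish when the flow imperfectly matches the target.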
Main file: 123_optimizing_markov_chain_monte_.pdf (728.64 KB)
Origin: files produced by the author(s)

Dates and versions

hal-04404948, version 1 (19-01-2024)

Identifiers

  • HAL Id: hal-04404948, version 1

Cite

Christoph Schönle, Marylou Gabrié. Optimizing Markov Chain Monte Carlo Convergence with Normalizing Flows and Gibbs Sampling. NeurIPS 2023 AI for Science Workshop, Dec 2023, New Orleans, LA, United States. ⟨hal-04404948⟩
83 views
120 downloads
