Preprint, Working Paper. Year: 2024

Bayesian Likelihood Free Inference using Mixtures of Experts

Abstract

We extend Bayesian Synthetic Likelihood (BSL) methods to non-Gaussian approximations of the likelihood function. In this setting, we introduce Mixtures of Experts (MoEs), a class of neural network models, as surrogate likelihoods that exhibit desirable approximation-theoretic properties. Moreover, MoEs can be estimated using Expectation-Maximization (EM) algorithm-based approaches, such as the Gaussian Locally Linear Mapping (GLLiM) model estimators that we implement. Further, we provide theoretical evidence that our procedure can estimate and approximate a wide range of likelihood functions. Through simulations, we demonstrate the superiority of our approach over existing BSL variants in terms of both posterior approximation accuracy and computational efficiency.
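To make the pipeline in the abstract concrete, the following Python sketch illustrates the general idea under simplifying assumptions: a hypothetical one-dimensional toy simulator, a single summary statistic, and a joint Gaussian mixture whose conditional density acts as a mixture of linear-Gaussian experts (a GLLiM-style surrogate). All names here (`simulator`, the prior range, the component count) are illustrative choices, not the authors' implementation.

```python
# Minimal sketch of a mixture-of-experts surrogate likelihood for
# likelihood-free inference. NOT the paper's GLLiM code; a toy stand-in.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    """Hypothetical model with skewed noise, so the true likelihood is non-Gaussian."""
    y = theta + rng.gamma(shape=2.0, scale=1.0, size=n)
    return y.mean()  # one summary statistic

# 1) Simulate training pairs (theta_i, s_i), theta drawn from a flat prior.
thetas = rng.uniform(-5.0, 5.0, size=5000)
stats = np.array([simulator(t) for t in thetas])
joint = np.column_stack([thetas, stats])

# 2) Fit a joint Gaussian mixture. Its conditional p(s | theta) is a
#    mixture of Gaussian-gated, linear-Gaussian experts, as in GLLiM.
gmm = GaussianMixture(n_components=5, covariance_type="full").fit(joint)

def surrogate_loglik(theta, s_obs):
    """Conditional log p(s_obs | theta) derived from the joint GMM."""
    logw = np.empty(gmm.n_components)
    comp = np.empty(gmm.n_components)
    for k in range(gmm.n_components):
        mu_t, mu_s = gmm.means_[k]
        (v_tt, v_ts), (_, v_ss) = gmm.covariances_[k]
        # Gating weight: pi_k * N(theta; mu_t, v_tt)
        logw[k] = np.log(gmm.weights_[k]) + norm.logpdf(theta, mu_t, np.sqrt(v_tt))
        # Expert: Gaussian in s with mean linear in theta
        cond_mu = mu_s + v_ts / v_tt * (theta - mu_t)
        cond_var = v_ss - v_ts**2 / v_tt
        comp[k] = norm.logpdf(s_obs, cond_mu, np.sqrt(cond_var))
    logw -= np.logaddexp.reduce(logw)  # normalize the gates
    return np.logaddexp.reduce(logw + comp)

# 3) Metropolis-Hastings on the surrogate posterior (flat prior on [-5, 5]).
s_obs = simulator(theta=1.0)
theta, chain = 0.0, []
lp = surrogate_loglik(theta, s_obs)
for _ in range(5000):
    prop = theta + 0.5 * rng.standard_normal()
    if -5.0 <= prop <= 5.0:
        lp_prop = surrogate_loglik(prop, s_obs)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
    chain.append(theta)

print("approximate posterior mean:", np.mean(chain[1000:]))
```

The structural point this sketch tries to convey is the one the abstract draws on: conditioning a joint Gaussian mixture yields a mixture of experts with Gaussian gates and experts whose means are linear in the parameter, which is exactly the form GLLiM estimates, and which replaces the single-Gaussian surrogate used by standard BSL.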
Main file: soumis-main0.pdf (1.31 MB). Origin: files produced by the author(s).

Dates and versions

hal-04436187, version 1 (03-02-2024)

Identifiers

  • HAL Id: hal-04436187, version 1

Cite

Florence Forbes, Hien Duy Nguyen, Trungtin Nguyen. Bayesian Likelihood Free Inference using Mixtures of Experts. 2024. ⟨hal-04436187⟩
165 views
161 downloads
