Bayesian nonparametric mixture of experts for high-dimensional inverse problems - Archive ouverte HAL
Preprint / working paper, year: 2023

Bayesian nonparametric mixture of experts for high-dimensional inverse problems

Abstract

Large classes of problems can be formulated as inverse problems, where the goal is to find parameter values that best explain some observed measurements. The relationship between parameters and observations is typically highly non-linear, with high-dimensional observations and correlated multidimensional parameters. To handle these constraints via inverse regression strategies, we consider the Gaussian Locally Linear Mapping (GLLiM) model, a special instance of mixture-of-experts models. We propose a general scheme for designing a Bayesian nonparametric GLLiM model, thereby avoiding any commitment to an arbitrary number of experts. A tractable estimation algorithm is derived using variational Bayesian expectation-maximization. We establish posterior consistency for the number of mixture components after post-processing with the merge-truncate-merge algorithm. Illustrations on simulated data show that the approach recovers the true number of experts and the mean regression function well.
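For readers unfamiliar with GLLiM, here is a minimal sketch of the standard formulation as it appears in the inverse-regression literature; the notation (low-dimensional parameter $x \in \mathbb{R}^{L}$, high-dimensional observation $y \in \mathbb{R}^{D}$, experts indexed by $k$) is chosen for illustration and is not taken from the paper itself:
\[
p(y \mid x) \;=\; \sum_{k=1}^{K}
\underbrace{\frac{\pi_k \,\mathcal{N}(x;\, c_k, \Gamma_k)}{\sum_{j=1}^{K} \pi_j \,\mathcal{N}(x;\, c_j, \Gamma_j)}}_{\text{gating weights}}
\;\underbrace{\mathcal{N}\!\left(y;\, A_k x + b_k,\, \Sigma_k\right)}_{\text{local affine expert}} .
\]
Because the joint distribution of $(x, y)$ under this model is a Gaussian mixture, the inverse conditional $p(x \mid y)$ is available in closed form as another Gaussian mixture, which is what makes GLLiM convenient for inverse problems. The Bayesian nonparametric construction described in the abstract removes the need to fix the number of experts $K$ in advance by placing a nonparametric prior on the mixing measure.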
Main file: BNP_GLLiM_JNPS.pdf (1.06 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04015203, version 1 (05-03-2023)
hal-04015203, version 2 (01-11-2023)

Identifiers

  • HAL Id: hal-04015203, version 2

Cite

Trungtin Nguyen, Florence Forbes, Julyan Arbel, Hien Duy Nguyen. Bayesian nonparametric mixture of experts for high-dimensional inverse problems. 2023. ⟨hal-04015203v2⟩
169 views
146 downloads
