Improved Variance Predictions in Approximate Message Passing
Abstract
In the Generalized Linear Model (GLM), the unknowns may be independent but non-identically distributed (niid), as for instance in the Sparse Bayesian Learning (SBL) problem. The Generalized Approximate Message Passing (GAMP) algorithm performs computationally efficient belief propagation for Bayesian inference. The GAMP algorithm predicts the posterior variances correctly in the case of measurement matrices with (n)iid entries. In order to cover more ill-conditioned measurement matrices, the (right) rotationally invariant (RRI) model was introduced, in which the (right) singular vectors are Haar distributed, leading to Vector AMP (VAMP). VAMP, however, assumes iid priors and posteriors. Here we introduce a convergent version of AMP (AMBAMP) applied to Unitarily transformed data, with a variance correction based on Haar Large System Analysis (LSA). The recently introduced reVAMP perspective shows that the resulting AMBUAMP algorithm has an underlying multivariate Gaussian posterior approximation, which is never explicitly computed but which enables the LSA. The individual variance predictions are asymptotically exact in the RRI setting, as illustrated by a Gaussian Mixture Model example.
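To make the variance-prediction aspect concrete, below is a minimal sketch (not the paper's AMBUAMP algorithm) of the scalar MMSE denoiser that AMP-type algorithms apply per coordinate under a Gaussian mixture prior, as in the abstract's Gaussian Mixture Model example. Given a pseudo-observation r = x + noise of known variance tau, it returns the posterior mean and the per-coordinate posterior variance that GAMP/VAMP-style methods track; the function name and all parameter values are illustrative assumptions.

```python
import numpy as np

def gmm_denoiser(r, tau, weights, means, variances):
    """Posterior mean/variance of x given r = x + N(0, tau),
    with prior x ~ sum_k weights[k] * N(means[k], variances[k])."""
    r = np.asarray(r, dtype=float)[..., None]      # broadcast over mixture components
    w, m, v = (np.asarray(a, dtype=float) for a in (weights, means, variances))
    s = v + tau                                    # variance of r under component k
    # responsibilities: posterior probability of each mixture component given r
    log_rho = np.log(w) - 0.5 * np.log(2 * np.pi * s) - 0.5 * (r - m) ** 2 / s
    rho = np.exp(log_rho - log_rho.max(axis=-1, keepdims=True))
    rho /= rho.sum(axis=-1, keepdims=True)
    # per-component posterior moments (Gaussian prior times Gaussian likelihood)
    post_m = (v * r + tau * m) / s
    post_v = v * tau / s
    mean = (rho * post_m).sum(axis=-1)
    var = (rho * (post_v + post_m ** 2)).sum(axis=-1) - mean ** 2
    return mean, var

# Example: a two-component, spike-and-slab-like mixture (hypothetical values)
mean, var = gmm_denoiser(r=np.array([0.1, 2.5]), tau=0.5,
                         weights=[0.9, 0.1], means=[0.0, 0.0],
                         variances=[0.01, 4.0])
print(mean, var)
```

The returned variances are exactly the quantities whose prediction the paper corrects: standard (G)AMP state evolution guarantees their accuracy only for (n)iid measurement matrices, whereas the Haar LSA-based correction extends exact individual variance prediction to the RRI setting.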
Domains

Computer Science [cs]

Origin: Files produced by the author(s)