Adaptive importance sampling for heavy-tailed distributions via α-divergence minimization
Abstract
Adaptive importance sampling (AIS) algorithms are widely used to approximate moments of target probability distributions. When the target has heavy tails, existing AIS algorithms can provide inconsistent estimators or exhibit slow convergence, as they often neglect the target’s tail behaviour. To avoid this pitfall, we propose an AIS algorithm that approximates the target by Student-t proposal distributions. We adapt location and scale parameters by matching the escort moments (defined even for heavy-tailed distributions) of the target and proposal. The resulting updates minimize the α-divergence between the target and the proposal, thereby connecting with variational inference methods. We then show that the α-divergence can be approximated by a generalized notion of effective sample size. We leverage this new perspective to adapt the proposal tail parameter using Bayesian optimization. We demonstrate the efficacy of our approach through applications to synthetic targets and a Bayesian Student-t regression task on real clinical trial data.
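To make the adaptation idea concrete, below is a minimal, hedged sketch of one adaptation step in this spirit: a one-dimensional Student-t proposal whose location and scale are updated from self-normalized importance-sampling estimates of the target's escort moments (moments under a density proportional to p^α). The function name `ais_step`, the choice α = 2 (chosen so the escort location and scale exist even for a Cauchy-like target), the fixed tail parameter ν, and the use of the standard effective sample size as a diagnostic are illustrative assumptions, not the authors' exact algorithm, which also adapts ν via Bayesian optimization and relies on a generalized notion of ESS.

```python
# Sketch of one AIS adaptation step with a Student-t proposal and escort-moment
# matching, estimated by self-normalized importance sampling. Illustrative only:
# update rule details, alpha=2, and the fixed tail parameter nu are assumptions.
import numpy as np
from scipy.stats import t as student_t

def ais_step(log_target, loc, scale, nu, alpha=2.0, n_samples=2000, rng=None):
    rng = np.random.default_rng(rng)
    # Draw from the current Student-t proposal q(. ; loc, scale, nu).
    x = student_t.rvs(df=nu, loc=loc, scale=scale, size=n_samples, random_state=rng)
    log_q = student_t.logpdf(x, df=nu, loc=loc, scale=scale)
    log_p = log_target(x)

    # Escort weights: w_i proportional to p(x_i)^alpha / q(x_i), self-normalized.
    # With alpha > 1 the escort density has lighter tails than the target, so its
    # first two moments exist even for heavy-tailed (e.g. Cauchy-like) targets.
    log_w = alpha * log_p - log_q
    log_w -= log_w.max()                      # numerical stabilization
    w = np.exp(log_w)
    w /= w.sum()

    # Match location and scale to the escort mean and escort standard deviation.
    # (The exact mapping from escort moments to Student-t parameters is simplified.)
    new_loc = np.sum(w * x)
    new_scale = np.sqrt(np.sum(w * (x - new_loc) ** 2))

    # Ordinary importance weights p/q and the standard effective sample size,
    # used here only as a diagnostic (the paper's generalized ESS differs).
    log_u = log_p - log_q
    u = np.exp(log_u - log_u.max())
    ess = u.sum() ** 2 / np.sum(u ** 2)
    return new_loc, new_scale, ess

# Usage example: a heavy-tailed (Cauchy) target approximated by an adapting Student-t.
if __name__ == "__main__":
    log_target = lambda x: student_t.logpdf(x, df=1.0, loc=3.0, scale=2.0)
    loc, scale, nu = 0.0, 1.0, 3.0
    for i in range(10):
        loc, scale, ess = ais_step(log_target, loc, scale, nu, rng=i)
        print(f"iter {i}: loc={loc:.3f} scale={scale:.3f} ESS={ess:.1f}")
```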
Domains: Statistics [math.ST]