Preprint, Working Paper. Year: 2024

Central Limit Theorem for Bayesian Neural Network trained with Variational Inference

Arnaud Descours
Tom Huix
Manon Michel
Éric Moulines
Boris Nectoux

Abstract

In this paper, we rigorously derive Central Limit Theorems (CLTs) for Bayesian two-layer neural networks in the infinite-width limit, trained by variational inference on a regression task. The networks are trained via different maximization schemes of the regularized evidence lower bound: (i) the idealized case, with exact estimation of a multiple Gaussian integral obtained from the reparametrization trick; (ii) a minibatch scheme using Monte Carlo sampling, commonly known as Bayes-by-Backprop; and (iii) a computationally cheaper algorithm named Minimal VI, recently introduced by leveraging the information available at the level of the mean-field limit. Laws of large numbers have already been rigorously proven for the three schemes, which admit the same asymptotic limit. By deriving CLTs, this work shows that the idealized and Bayes-by-Backprop schemes have similar fluctuation behavior, which differs from that of the Minimal VI scheme. Numerical experiments then illustrate that the Minimal VI scheme remains more efficient, despite its larger variance, thanks to its substantial gain in computational complexity.
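As an illustration of scheme (ii), the following is a minimal PyTorch sketch of a Bayes-by-Backprop-style, one-sample Monte Carlo estimate of the regularized ELBO via the reparametrization trick. The network form (a mean-field two-layer network averaging tanh units), the standard Gaussian prior, and all hyperparameters are illustrative assumptions, not taken from the paper.

    import torch

    def predict(x, w):
        # Mean-field two-layer network: f(x) = (1/N) * sum_i tanh(w_i . x).
        # x: (B, d) minibatch, w: (N, d) hidden-layer weights -> (B,) outputs.
        return torch.tanh(x @ w.T).mean(dim=1)

    def elbo_estimate(x, y, mu, rho, sigma_prior=1.0, noise_std=1.0):
        # One-sample Monte Carlo estimate of the regularized ELBO via the
        # reparametrization trick (a Bayes-by-Backprop-style estimator).
        # mu, rho: (N, d) variational means and pre-softplus scales of an
        # assumed mean-field Gaussian posterior; all settings are illustrative.
        sigma = torch.nn.functional.softplus(rho)   # enforce sigma > 0
        eps = torch.randn_like(mu)
        w = mu + sigma * eps                        # reparametrized weight sample
        pred = predict(x, w)
        # Gaussian log-likelihood of the regression minibatch (up to constants).
        log_lik = -0.5 * ((y - pred) ** 2).sum() / noise_std ** 2
        # Closed-form KL between the mean-field Gaussian q and the
        # N(0, sigma_prior^2 I) prior, summed over all weights.
        kl = 0.5 * ((sigma ** 2 + mu ** 2) / sigma_prior ** 2
                    - 1.0 - 2.0 * torch.log(sigma / sigma_prior)).sum()
        return log_lik - kl

    # Usage: one stochastic ascent step on the ELBO.
    N, d, B = 100, 5, 32
    mu = torch.zeros(N, d, requires_grad=True)
    rho = torch.full((N, d), -3.0, requires_grad=True)
    x, y = torch.randn(B, d), torch.randn(B)
    loss = -elbo_estimate(x, y, mu, rho)
    loss.backward()   # autograd differentiates through the reparametrized sample

In this sketch, the idealized scheme (i) would replace the single-sample likelihood term by its exact expectation under the variational posterior, a multiple Gaussian integral; the reparametrization is what makes the Monte Carlo surrogate differentiable with respect to mu and rho.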
Main file
clt_bnn_arxiv.pdf (697.67 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04599502, version 1 (07-06-2024)

Identifiers

HAL Id: hal-04599502

Cite

Arnaud Descours, Tom Huix, Arnaud Guillin, Manon Michel, Éric Moulines, et al. Central Limit Theorem for Bayesian Neural Network trained with Variational Inference. 2024. ⟨hal-04599502⟩
92 Views
141 Downloads
