Conference paper, Year: 2023

Statistical Guarantees for Variational Autoencoders using PAC-Bayesian Theory

Abstract

Since their inception, Variational Autoencoders (VAEs) have become central in machine learning. Despite their widespread use, numerous questions regarding their theoretical properties remain open. Using PAC-Bayesian theory, this work develops statistical guarantees for VAEs. First, we derive the first PAC-Bayesian bound for posterior distributions conditioned on individual samples from the data-generating distribution. Then, we utilize this result to develop generalization guarantees for the VAE's reconstruction loss, as well as upper bounds on the distance between the input and the regenerated distributions. More importantly, we provide upper bounds on the Wasserstein distance between the input distribution and the distribution defined by the VAE's generative model.
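For background only, and not as the bound derived in this paper, the classical McAllester-style PAC-Bayesian guarantee for a loss bounded in [0, 1] can be sketched as follows, assuming a prior \(\pi\) fixed before observing the sample and any posterior \(\rho\) over hypotheses:

\[
\Pr_{S \sim \mathcal{D}^n}\!\left(
\forall \rho:\;
\mathbb{E}_{h \sim \rho}\big[L_{\mathcal{D}}(h)\big]
\;\le\;
\mathbb{E}_{h \sim \rho}\big[\widehat{L}_{S}(h)\big]
+ \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
\right) \;\ge\; 1 - \delta,
\]

where \(L_{\mathcal{D}}\) is the risk under the data-generating distribution, \(\widehat{L}_{S}\) the empirical risk on \(n\) i.i.d. samples, and \(\delta \in (0,1)\) the confidence parameter. The classical statement requires a single posterior chosen from the whole sample; the paper's contribution, per the abstract, is to extend guarantees of this flavor to posteriors conditioned on individual samples (as with a VAE encoder) and to translate them into reconstruction-loss and Wasserstein-distance bounds for the generative model.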
Main file: neurips_camera_2-2.pdf (1.12 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04233547, version 1 (09-10-2023)

Identifiers

  • HAL Id: hal-04233547, version 1

Cite

Sokhna Diarra Mbacke, Florence Clerc, Pascal Germain. Statistical Guarantees for Variational Autoencoders using PAC-Bayesian Theory. 37th Conference on Neural Information Processing Systems (NeurIPS 2023), Dec 2023, New Orleans, United States. ⟨hal-04233547⟩
