Preprint / Working Paper, Year: 2023

Variational excess risk bound for general state space models

Abstract

In this paper, we consider variational autoencoders (VAE) for general state space models. We analyze the excess risk associated with VAE using a backward factorization of the variational distributions. Such backward factorizations were recently proposed to perform online variational learning and to obtain upper bounds on the variational estimation error. When independent trajectories of sequences are observed, and under strong mixing assumptions on the state space model and on the variational distribution, we provide an oracle inequality that is explicit in the number of samples and in the length of the observation sequences. We then derive consequences of this theoretical result. In particular, when the data distribution is given by a state space model, we provide upper bounds for the Kullback-Leibler divergence between the data distribution and its estimator, and between the variational posterior and the estimated state space posterior distribution. Under classical assumptions, we prove that our results can be applied to Gaussian backward kernels built with dense and recurrent neural networks.
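
For intuition only (the notation below is not taken from the paper, and the exact conditioning structure is an assumption), a backward factorization of a variational smoothing distribution over latent states x_{0:T} given observations y_{0:T} is typically written as

\[
q_\phi(x_{0:T} \mid y_{0:T}) = q_{T}(x_T \mid y_{0:T}) \prod_{t=0}^{T-1} q_{t \mid t+1}(x_t \mid x_{t+1}, y_{0:t}),
\]

mirroring the backward decomposition of the exact smoothing distribution, \( p(x_{0:T} \mid y_{0:T}) = p(x_T \mid y_{0:T}) \prod_{t=0}^{T-1} p(x_t \mid x_{t+1}, y_{0:t}) \).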
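
The following is a minimal sketch, not the authors' implementation, of a Gaussian backward kernel q_{t|t+1}(x_t | x_{t+1}, y_{0:t}) whose mean and diagonal covariance are produced by a dense head on top of a recurrent encoding of the observations; all class and argument names are hypothetical.

# Minimal sketch (hypothetical names): Gaussian backward kernel built
# from a recurrent encoder of y_{0:t} and a dense output head.
import torch
import torch.nn as nn

class GaussianBackwardKernel(nn.Module):
    def __init__(self, obs_dim, state_dim, hidden_dim=64):
        super().__init__()
        # Recurrent network summarizing the observations y_{0:t}.
        self.rnn = nn.GRU(obs_dim, hidden_dim, batch_first=True)
        # Dense head mapping (h_t, x_{t+1}) to the kernel's mean and log-variance.
        self.head = nn.Sequential(
            nn.Linear(hidden_dim + state_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 2 * state_dim),
        )

    def forward(self, y_0_to_t, x_next):
        # y_0_to_t: (batch, t+1, obs_dim); x_next: (batch, state_dim).
        _, h_t = self.rnn(y_0_to_t)                   # h_t: (1, batch, hidden_dim)
        stats = self.head(torch.cat([h_t[-1], x_next], dim=-1))
        mean, log_var = stats.chunk(2, dim=-1)
        # Gaussian backward kernel with diagonal covariance.
        return torch.distributions.Normal(mean, torch.exp(0.5 * log_var))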
Main file: VAEhmm_arXiv.pdf (291.5 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04344697, version 1 (14-12-2023)

Identifiers

Cite

Élisabeth Gassiat, Sylvain Le Corff. Variational excess risk bound for general state space models. 2023. ⟨hal-04344697⟩