THE ROLE OF MUTUAL INFORMATION IN VARIATIONAL CLASSIFIERS - Archive ouverte HAL
Journal article in Machine Learning, Year: 2023

THE ROLE OF MUTUAL INFORMATION IN VARIATIONAL CLASSIFIERS

Abstract

This paper was accepted for publication in Machine Learning (Springer). Overfitting is a well-known phenomenon related to fitting a model that mimics a particular instance of data too closely (or exactly), and which may therefore fail to predict future observations reliably. In practice, this behaviour is controlled by various regularization techniques, sometimes based on heuristics, which are motivated by upper bounds on the generalization error. In this work, we study the generalization error of classifiers relying on stochastic encodings trained with the cross-entropy loss, which is widely used for classification problems in deep learning. We derive bounds on the generalization error showing that there exists a regime where it is bounded by the mutual information between the input features and the corresponding representations in the latent space, which are randomly generated according to the encoding distribution. Our bounds provide an information-theoretic understanding of generalization in the so-called class of variational classifiers, which are regularized by a Kullback-Leibler (KL) divergence term. These results give theoretical grounds for the highly popular KL term in variational inference methods, which was already recognized to act effectively as a regularization penalty. We further observe connections with well-studied notions such as Variational Autoencoders, Information Dropout, the Information Bottleneck and Boltzmann Machines. Finally, we perform numerical experiments on MNIST, CIFAR and other datasets and show that mutual information is indeed highly representative of the behaviour of the generalization error.
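As a concrete illustration of the setup described in the abstract, the sketch below trains a classifier on a stochastic (Gaussian) encoding with the cross-entropy loss plus a KL-divergence penalty toward a standard normal prior, in the spirit of variational classifiers. This is a minimal sketch under assumptions not taken from the paper: the architecture, the Gaussian encoder, the standard normal prior and the weight `beta` are all illustrative choices.

```python
# Minimal sketch of a variational classifier (illustrative; not the paper's exact model).
# Assumptions: Gaussian encoder q(z|x), standard normal prior, KL weight `beta`.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalClassifier(nn.Module):
    def __init__(self, in_dim=784, latent_dim=32, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(256, latent_dim)   # log-variance of q(z|x)
        self.classifier = nn.Linear(latent_dim, num_classes)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization: sample the stochastic representation z ~ q(z|x).
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.classifier(z), mu, logvar

def loss_fn(logits, y, mu, logvar, beta=1e-3):
    # Cross-entropy on the stochastic representation, plus the per-sample
    # KL regularizer D_KL(q(z|x) || N(0, I)); its average over the data
    # upper-bounds the mutual information I(X; Z).
    ce = F.cross_entropy(logits, y)
    kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0).sum(dim=1).mean()
    return ce + beta * kl
```

In this hypothetical setup, the KL term plays the role of the regularization penalty discussed above, and `beta` trades off fit against the information the representation retains about the input.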
Main file
THE ROLE OF MUTUAL INFORMATION.pdf (1.09 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04137024, version 1 (22-06-2023)

Identifiers

Cite

Matias Vera, Leonardo Rey Vega, Pablo Piantanida. THE ROLE OF MUTUAL INFORMATION IN VARIATIONAL CLASSIFIERS. Machine Learning, 2023, ⟨10.1007/s10994-023-06337-6⟩. ⟨hal-04137024⟩