Conference paper · Year: 2023

Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator

Yuka Hashimoto
Masahiro Ikeda
Hachem Kadri

Abstract

A reproducing kernel Hilbert $C^*$-module (RKHM) is a generalization of a reproducing kernel Hilbert space (RKHS) by means of a $C^*$-algebra, and the Perron-Frobenius operator is a linear operator related to the composition of functions. Combining these two concepts, we present deep RKHM, a deep learning framework for kernel methods. We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators. By virtue of $C^*$-algebra, the dependency of the bound on the output dimension is milder than in existing bounds. We show that $C^*$-algebra is a suitable tool for deep learning with kernels, enabling us to take advantage of the product structure of operators and to provide a clear connection with convolutional neural networks. Our theoretical analysis offers a new lens through which one can design and analyze deep kernel methods.
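To make the general idea of stacking kernel-based layers concrete, here is a minimal, purely illustrative Python sketch. It is an assumed toy setup, not the paper's RKHM or Perron-Frobenius construction: it composes two random Fourier feature maps (each approximating a Gaussian-kernel feature map) and fits the last layer by ridge regression on a synthetic 1-D regression problem.

import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(dim_in, dim_out, gamma=1.0):
    # Random Fourier feature map x -> sqrt(2/D) * cos(x W + b), which
    # approximates the feature map of a Gaussian (RBF) kernel with width gamma.
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(dim_in, dim_out))
    b = rng.uniform(0, 2 * np.pi, size=dim_out)
    return lambda X: np.sqrt(2.0 / dim_out) * np.cos(X @ W + b)

# "Deep" model: composition phi2(phi1(x)) of two kernel feature maps.
phi1 = random_fourier_features(dim_in=1, dim_out=64)
phi2 = random_fourier_features(dim_in=64, dim_out=64)

# Toy 1-D regression problem; the final layer is plain ridge regression.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)
Phi = phi2(phi1(X))
reg = 1e-3
w = np.linalg.solve(Phi.T @ Phi + reg * np.eye(Phi.shape[1]), Phi.T @ y)
print("train MSE:", np.mean((Phi @ w - y) ** 2))

This toy composition only illustrates what "deep" means here; in the paper the layers live in RKHMs over a $C^*$-algebra and the analysis (generalization bound, benign overfitting) is carried out through the associated Perron-Frobenius operators, which this sketch does not capture.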
Main file
NeurIPS-2023-deep-learning-with-kernels-through-rkhm-and-the-perron-frobenius-operator-Paper-Conference.pdf (679.94 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04523927, version 1 (27-03-2024)

Cite

Yuka Hashimoto, Masahiro Ikeda, Hachem Kadri. Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator. Advances in Neural Information Processing Systems (NeurIPS), Dec 2023, New Orleans (LA), United States. ⟨hal-04523927⟩