Conference paper. Year: 2024

Learning via Surrogate PAC-Bayes


Abstract

PAC-Bayes learning is a comprehensive setting for (i) studying the generalisation ability of learning algorithms and (ii) deriving new learning algorithms by optimising a generalisation bound. However, optimising generalisation bounds might not always be viable for tractability or computational reasons, or both. For example, iteratively querying the empirical risk might prove computationally expensive. In response, we introduce a novel principled strategy for building an iterative learning algorithm via the optimisation of a sequence of surrogate training objectives, inherited from PAC-Bayes generalisation bounds. The key argument is to replace the empirical risk (seen as a function of hypotheses) in the generalisation bound by its projection onto a constructible low-dimensional functional space: these projections can be queried much more efficiently than the initial risk. On top of providing this generic recipe for learning via surrogate PAC-Bayes bounds, we (i) contribute theoretical results establishing that iteratively optimising our surrogates implies the optimisation of the original generalisation bounds, (ii) instantiate this strategy in the framework of meta-learning, introducing a meta-objective with a closed-form expression for the meta-gradient, and (iii) illustrate our approach with numerical experiments inspired by an industrial biochemical problem.
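
As a rough illustration of the recipe sketched in the abstract (and not the authors' actual construction), the Python snippet below queries the expensive empirical risk only at a handful of anchor hypotheses, fits a cheap surrogate from those queries (a least-squares projection onto a quadratic feature space, used here as a stand-in for the paper's functional projection), and then iteratively optimises a McAllester-style PAC-Bayes-type objective built on that surrogate. The Gaussian posterior, the quadratic basis, and the parameters sigma2, lam, and the anchor scheme are illustrative assumptions.

```python
# Minimal sketch of learning via a surrogate PAC-Bayes-style objective.
# All modelling choices are illustrative, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 3                                   # sample size, hypothesis dimension
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

def empirical_risk(theta):
    """Expensive-to-query empirical risk (here: mean squared error)."""
    return np.mean((X @ theta - y) ** 2)

def quad_features(t):
    """Low-dimensional feature map spanning the surrogate space."""
    return np.concatenate(([1.0], t, np.outer(t, t)[np.triu_indices(d)]))

def fit_surrogate(anchors):
    """Project the risk onto the surrogate space using only a few risk queries."""
    feats = np.array([quad_features(t) for t in anchors])
    risks = np.array([empirical_risk(t) for t in anchors])   # the only risk queries
    coef, *_ = np.linalg.lstsq(feats, risks, rcond=None)
    return lambda theta: float(quad_features(theta) @ coef)

def surrogate_bound(mu, surrogate, sigma2=0.1, lam=1.0):
    """Surrogate objective: surrogate risk at the posterior mean plus a
    KL(Q||P) penalty for Gaussian Q = N(mu, sigma2 I) and prior P = N(0, I)."""
    kl = 0.5 * (d * sigma2 + mu @ mu - d + d * np.log(1.0 / sigma2))
    return surrogate(mu) + kl / (lam * n)

# Iterative scheme: refresh the surrogate around the current iterate,
# then take gradient steps on the surrogate bound (finite differences for brevity).
mu = np.zeros(d)
for outer in range(5):
    anchors = mu + 0.5 * rng.normal(size=(12, d))
    s = fit_surrogate(anchors)
    for _ in range(50):
        grad = np.array([(surrogate_bound(mu + 1e-4 * e, s)
                          - surrogate_bound(mu - 1e-4 * e, s)) / 2e-4
                         for e in np.eye(d)])
        mu = mu - 0.1 * grad
    print(f"outer step {outer}: true empirical risk {empirical_risk(mu):.4f}")
```

In this toy setting the empirical risk is exactly quadratic in the hypothesis, so the quadratic surrogate captures it well and each outer step needs only a dozen risk evaluations instead of one per gradient step, which is the computational point the abstract makes.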
Main file: main.pdf (2.68 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04731841, version 1 (11-10-2024)
hal-04731841, version 2 (27-11-2024)

Identifiers

  • HAL Id: hal-04731841, version 1

Cite

Antoine Picard-Weibel, Roman Moscoviz, Benjamin Guedj. Learning via Surrogate PAC-Bayes. NeurIPS 2024, Dec 2024, Vancouver, Canada. ⟨hal-04731841v1⟩