Conference paper, Year: 2022

Dimension-free convergence rates for gradient Langevin dynamics in RKHS

Abstract

Gradient Langevin dynamics (GLD) and stochastic GLD (SGLD) have attracted considerable attention lately as a way to provide convergence guarantees in a non-convex setting. However, the known rates grow exponentially with the dimension of the space under the dissipative condition. In this work, we provide a convergence analysis of GLD and SGLD when the optimization space is an infinite-dimensional Hilbert space. More precisely, we derive non-asymptotic, dimension-free convergence rates for GLD/SGLD when performing regularized non-convex optimization in a reproducing kernel Hilbert space. Among other tools, the convergence analysis relies on the properties of a stochastic differential equation, its discrete-time Galerkin approximation, and the geometric ergodicity of the associated Markov chains.
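For readers unfamiliar with the method, the finite-dimensional GLD iteration that this work generalizes is the Euler–Maruyama discretization of the Langevin SDE. The sketch below is a generic illustration of that update, not the paper's RKHS/Galerkin construction; the names (gld_step, eta, beta) and the toy objective are ours, chosen only for demonstration.

```python
import numpy as np

def gld_step(theta, grad_f, eta, beta, rng):
    """One gradient Langevin dynamics (GLD) update: a gradient
    descent step plus Gaussian noise whose scale is set by the
    step size eta and the inverse temperature beta."""
    noise = rng.standard_normal(theta.shape)
    return theta - eta * grad_f(theta) + np.sqrt(2.0 * eta / beta) * noise

# Toy usage: a non-convex double-well objective f(t) = (t**2 - 1)**2.
rng = np.random.default_rng(0)
grad_f = lambda t: 4.0 * t**3 - 4.0 * t  # gradient of (t**2 - 1)**2
theta = np.array([2.0])
for _ in range(1000):
    theta = gld_step(theta, grad_f, eta=1e-2, beta=10.0, rng=rng)
```

SGLD follows the same recursion with grad_f replaced by an unbiased stochastic estimate (e.g., a mini-batch gradient); the injected noise is what allows the iterates to escape local minima in the non-convex setting.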
Main file: muzellec22a.pdf (589.54 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03920387, version 1 (03-01-2023)

Identifiers

  • HAL Id: hal-03920387, version 1

Cite

Boris Muzellec, Kanji Sato, Mathurin Massias, Taiji Suzuki. Dimension-free convergence rates for gradient Langevin dynamics in RKHS. COLT 2022 - 35th Annual Conference on Learning Theory, Jul 2022, London, United Kingdom. ⟨hal-03920387⟩
