Preprint / Working paper, Year: 2022

Towards diffusion approximations for stochastic gradient descent without replacement

Abstract

Stochastic gradient descent without replacement or reshuffling (SGDo) is predominantly used to train machine learning models in practice. However, the mathematical theory of this algorithm remains underexplored compared to its "with replacement" and "infinite data" counterparts. We propose a stochastic, continuous-time approximation to SGDo based on a family of stochastic differential equations driven by a stochastic process we call epoched Brownian motion, which encapsulates the behavior of reusing the same sequence of data points in subsequent epochs. We investigate this diffusion approximation by considering an application of SGDo to linear regression. Explicit convergence results are derived both for constant learning rates and for learning-rate sequences satisfying the Robbins-Monro conditions. Finally, the validity of the continuous-time dynamics is further substantiated by numerical experiments.
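
To make the algorithm concrete, here is a minimal sketch (not the authors' code) of SGDo applied to linear regression, the setting studied in the paper: the data are shuffled once and the same sequence of points is then reused in every epoch. The function name, learning rate, and synthetic data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sgdo_linear_regression(X, y, lr=0.01, epochs=50, seed=0):
    """SGD without replacement on the least-squares loss.

    The data are shuffled once; the same sequence of points is
    reused in every subsequent epoch, matching the behavior the
    paper's epoched Brownian motion is designed to capture.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    order = rng.permutation(n)  # fixed once, reused across epochs
    w = np.zeros(d)
    for _ in range(epochs):
        for i in order:  # each data point is visited exactly once per epoch
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x_i . w - y_i)^2
            w -= lr * grad
    return w

# Illustrative usage on synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
print(sgdo_linear_regression(X, y))  # should approach [1.0, -2.0, 0.5]
```

For reference, the Robbins-Monro conditions mentioned in the abstract are the standard requirements on a learning-rate sequence $(\gamma_n)_{n \ge 1}$:

\[ \sum_{n=1}^{\infty} \gamma_n = \infty, \qquad \sum_{n=1}^{\infty} \gamma_n^2 < \infty, \]

satisfied, for example, by $\gamma_n = c/n$ with $c > 0$.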
Main file: TowardsDiffusionForSGDo_v1.pdf (463.06 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03527878, version 1 (16-01-2022)

Identifiers

  • HAL Id: hal-03527878, version 1

Cite

Stefan Ankirchner, Stefan Perko. Towards diffusion approximations for stochastic gradient descent without replacement. 2022. ⟨hal-03527878⟩
124 Views
133 Downloads
