Preprints, Working Papers, ... Year: 2022

Towards diffusion approximations for stochastic gradient descent without replacement

Abstract

Stochastic gradient descent without replacement or reshuffling (SGDo) is predominantly used to train machine learning models in practice. However, the mathematical theory of this algorithm remains underexplored compared to its "with replacement" and "infinite data" counterparts. We propose a stochastic, continuous-time approximation to SGDo based on a family of stochastic differential equations driven by a stochastic process we call epoched Brownian motion, which encapsulates the behavior of reusing the same sequence of data points in subsequent epochs. We investigate this diffusion approximation by applying SGDo to linear regression. Explicit convergence results are derived both for constant learning rates and for learning-rate sequences satisfying the Robbins-Monro conditions. Finally, the validity of the continuous-time dynamics is further substantiated by numerical experiments.
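
As a concrete illustration of the algorithm studied in the abstract, the following minimal Python sketch (not taken from the paper; the function name, synthetic data, and constant learning rate are illustrative assumptions) applies SGDo to least-squares linear regression: the data are shuffled once and the same pass order is reused in every epoch, which is the "same sequence of data points in subsequent epochs" behavior that epoched Brownian motion is designed to capture. The Robbins-Monro conditions mentioned in the abstract require a learning-rate sequence (gamma_n) with sum gamma_n = infinity and sum gamma_n^2 < infinity, e.g. gamma_n = c/n, which could replace the constant rate below.

    import numpy as np

    def sgdo_linear_regression(X, y, lr=0.01, n_epochs=50, seed=0):
        """SGDo sketch: single-sample gradient steps over one fixed
        permutation of the data, reused in every epoch."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        theta = np.zeros(d)
        order = rng.permutation(n)  # shuffle once; reuse this order each epoch
        for _ in range(n_epochs):
            for i in order:
                # gradient of the per-sample loss 0.5 * (x_i . theta - y_i)^2
                grad = (X[i] @ theta - y[i]) * X[i]
                theta -= lr * grad  # a decaying schedule gamma_n = c/n would satisfy Robbins-Monro
        return theta

    # Illustrative usage on synthetic data (all values are assumptions)
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 3))
    theta_true = np.array([1.0, -2.0, 0.5])
    y = X @ theta_true + 0.1 * rng.standard_normal(200)
    print(sgdo_linear_regression(X, y))
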
Main file: TowardsDiffusionForSGDo_v1.pdf (463.06 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03527878, version 1 (16-01-2022)

Identifiers

  • HAL Id: hal-03527878, version 1

Cite

Stefan Ankirchner, Stefan Perko. Towards diffusion approximations for stochastic gradient descent without replacement. 2022. ⟨hal-03527878⟩