Journal article. ESAIM: Probability and Statistics, 2023.

Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Streaming Data

Abstract

We introduce a streaming framework for analyzing stochastic approximation/optimization problems. This streaming framework is analogous to solving optimization problems using time-varying mini-batches that arrive sequentially. We provide non-asymptotic convergence rates for various gradient-based algorithms, including the well-known Stochastic Gradient (SG) descent (a.k.a. the Robbins-Monro algorithm), mini-batch SG, and time-varying mini-batch SG algorithms, as well as their iterated averages (a.k.a. Polyak-Ruppert averaging). We show (i) how to accelerate convergence by choosing the learning rate according to the time-varying mini-batches, (ii) that Polyak-Ruppert averaging achieves optimal convergence in the sense of attaining the Cramér-Rao lower bound, and (iii) how time-varying mini-batches together with Polyak-Ruppert averaging can provide variance reduction and accelerate convergence simultaneously, which is advantageous for many learning problems, such as online, sequential, and large-scale learning. We further demonstrate these favorable effects for various choices of time-varying mini-batches.
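To make the setting concrete, the following is a minimal Python sketch of time-varying mini-batch SG with Polyak-Ruppert averaging, applied to a streaming least-squares problem. The function name streaming_sgd, the learning-rate schedule gamma_t = c_gamma * (n_t * t)**(-alpha), and the growing batch sizes are illustrative assumptions for exposition, not the paper's exact procedure or constants.

```python
import numpy as np

def streaming_sgd(grad, theta0, stream, c_gamma=1.0, alpha=2/3):
    """Time-varying mini-batch SG with Polyak-Ruppert averaging (illustrative sketch).

    grad(theta, batch): stochastic gradient estimate computed on one mini-batch.
    stream: iterable of mini-batches whose sizes n_t may vary (e.g. grow) over time.
    The step size gamma_t = c_gamma * (n_t * t)**(-alpha) ties the learning rate to
    the current mini-batch size n_t (an assumed, simplified schedule).
    """
    theta = np.asarray(theta0, dtype=float)
    theta_bar = theta.copy()
    for t, batch in enumerate(stream, start=1):
        n_t = len(batch)
        gamma_t = c_gamma * (n_t * t) ** (-alpha)      # learning rate scaled by batch size
        theta = theta - gamma_t * grad(theta, batch)   # mini-batch SG step
        theta_bar += (theta - theta_bar) / t           # running Polyak-Ruppert average
    return theta, theta_bar

# Usage on synthetic streaming least squares (hypothetical batch-size rule n_t = 2*t).
rng = np.random.default_rng(0)
theta_star = np.array([1.0, -2.0])

def make_batch(n):
    X = rng.normal(size=(n, 2))
    y = X @ theta_star + rng.normal(size=n)
    return list(zip(X, y))

def grad(theta, batch):
    X = np.array([x for x, _ in batch])
    y = np.array([v for _, v in batch])
    return X.T @ (X @ theta - y) / len(batch)

stream = (make_batch(2 * t) for t in range(1, 200))
last_iterate, averaged_iterate = streaming_sgd(grad, np.zeros(2), stream)
```

In this sketch the averaged iterate is typically a better estimate of theta_star than the last iterate, illustrating the variance-reduction effect of Polyak-Ruppert averaging discussed in the abstract.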
Main file: ps220013.pdf (1.47 MB)
Origin: Publication funded by an institution

Dates and versions

hal-04066897, version 1 (12-04-2023)

Cite

Antoine Godichon-Baggioni, Nicklas Werge, Olivier Wintenberger. Non-Asymptotic Analysis of Stochastic Approximation Algorithms for Streaming Data. ESAIM: Probability and Statistics, 2023, 27, pp.482-514. ⟨10.1051/ps/2023006⟩. ⟨hal-04066897⟩