Preprint / Working paper. Year: 2024

On the SAGA algorithm with decreasing step.

Luis Fredes
Bernard Bercu
Eméric Gbaguidi

Abstract

Stochastic optimization problems naturally appear in many application areas, including machine learning. Our goal is to go further in the analysis of the Stochastic Average Gradient Accelerated (SAGA) algorithm. To achieve this, we introduce a new $\lambda$-SAGA algorithm which interpolates between Stochastic Gradient Descent ($\lambda=0$) and the SAGA algorithm ($\lambda=1$). Firstly, we investigate the almost sure convergence of this new algorithm with decreasing step, which allows us to avoid the restrictive strong convexity and Lipschitz gradient hypotheses associated with the objective function. Secondly, we establish a central limit theorem for the $\lambda$-SAGA algorithm. Finally, we provide the non-asymptotic $\mathbb{L}^p$ rates of convergence.
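To illustrate the interpolation described in the abstract, the following is a minimal sketch of one plausible $\lambda$-SAGA iteration with a decreasing step. The exact update rule, the step schedule gamma_t = c / (t+1)^alpha, and all function and parameter names (lambda_saga, grad_i, c, alpha) are assumptions for illustration, not the authors' definition; the only grounding is that lam = 0 should recover plain SGD and lam = 1 the standard SAGA update.

```python
import numpy as np

def lambda_saga(grad_i, x0, n, lam=1.0, c=1.0, alpha=0.75, n_iter=10_000, rng=None):
    """Hypothetical sketch of a lambda-SAGA run with decreasing step.

    grad_i(x, i): gradient of the i-th component function at x (user-supplied).
    lam = 0 gives plain SGD, lam = 1 gives SAGA (assumed form of the interpolation).
    gamma_t = c / (t + 1)**alpha is one possible decreasing step schedule.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    table = np.zeros((n,) + x.shape)        # stored past gradients g_j
    avg = table.mean(axis=0)                # running average (1/n) * sum_j g_j
    for t in range(n_iter):
        i = rng.integers(n)                 # sample one component uniformly
        g_new = grad_i(x, i)
        # SGD direction plus a lambda-weighted SAGA variance-reduction correction
        direction = g_new - lam * (table[i] - avg)
        gamma = c / (t + 1) ** alpha        # decreasing step size
        x -= gamma * direction
        avg += (g_new - table[i]) / n       # keep the gradient average up to date
        table[i] = g_new
    return x
```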
Main file
main.pdf (515.4 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04716858, version 1 (01-10-2024)

Identifiers

Cite

Luis Fredes, Bernard Bercu, Eméric Gbaguidi. On the SAGA algorithm with decreasing step. 2024. ⟨hal-04716858⟩

Collections

CNRS IMB INSMI ANR