Stochastic Gradient Descent Revisited

Preprint / Working Paper, Year: 2024

Abstract

Stochastic gradient descent (SGD) algorithms have been a go-to solution for nonconvex stochastic optimization problems arising in machine learning. Their theory, however, often requires a strong framework to guarantee convergence properties. We hereby present a full-scope convergence study of nonconvex SGD, covering weak convergence, function-value convergence, and global convergence, and also provide the subsequent convergence rates and complexities, all under relatively mild conditions in comparison with the literature.
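For readers unfamiliar with the recursion the abstract refers to, below is a minimal sketch of plain SGD, x_{k+1} = x_k - gamma_k g(x_k, xi_k), where g is an unbiased stochastic estimate of the gradient. The toy nonconvex objective, the noise model, and the Robbins-Monro step sizes are illustrative assumptions for this sketch, not the specific setting studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd(grad_estimate, x0, step_size, n_iters):
    """Plain SGD recursion: x_{k+1} = x_k - gamma_k * g(x_k, xi_k)."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        x = x - step_size(k) * grad_estimate(x)
    return x

# Hypothetical toy objective f(x) = ||x||^2 / 2 + sum(cos(2 x_i)); it is
# nonconvex (its one-dimensional second derivative 1 - 4 cos(2x) changes
# sign). Its exact gradient is x - 2 sin(2x), observed here through
# additive Gaussian noise, which keeps the estimate unbiased.
def noisy_grad(x):
    return x - 2.0 * np.sin(2.0 * x) + 0.1 * rng.standard_normal(x.shape)

# Robbins-Monro step sizes gamma_k = 1 / (k + 1): the sum of gamma_k
# diverges while the sum of gamma_k^2 converges, the classical decay
# regime under which SGD convergence is usually analyzed.
x_final = sgd(noisy_grad, x0=np.ones(5),
              step_size=lambda k: 1.0 / (k + 1), n_iters=10_000)
print(x_final)
```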
Main file

main.pdf (700.63 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04808083, version 1 (27-11-2024)
hal-04808083, version 2 (08-12-2024)

Identifiers

  • HAL Id: hal-04808083, version 1

Cite

Azar Louzi. Stochastic Gradient Descent Revisited. 2024. ⟨hal-04808083v1⟩