Preprint, Working Paper. Year: 2023

Non asymptotic analysis of Adaptive stochastic gradient algorithms and applications

Abstract

In stochastic optimization, a common tool for dealing sequentially with large samples is the well-known stochastic gradient algorithm. Nevertheless, since the step sequence is the same for every direction, this can lead to poor results in practice on ill-conditioned problems. To overcome this, adaptive gradient algorithms such as Adagrad or Stochastic Newton algorithms should be preferred. This paper is devoted to the non-asymptotic analysis of these adaptive gradient algorithms for strongly convex objectives. All the theoretical results are adapted to linear regression and regularized generalized linear models, for both Adagrad and Stochastic Newton algorithms.
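To make the two algorithm families in the abstract concrete, here is a minimal Python sketch of Adagrad and of a stochastic Newton recursion for streaming linear regression. This is an illustration under simplifying assumptions, not the paper's exact procedures: the function names, the base step size eta, and the ridge constant c are hypothetical choices made for this sketch.

```python
import numpy as np

def adagrad_linear_regression(X, y, eta=1.0, eps=1e-8):
    """One pass of Adagrad over (X, y) for least-squares linear regression.

    Each coordinate gets its own step size eta / sqrt(sum of past squared
    gradients), which is what makes the method robust to ill-conditioning.
    """
    n, d = X.shape
    theta = np.zeros(d)
    G = np.zeros(d)  # running sum of squared gradient coordinates
    for i in range(n):
        g = (X[i] @ theta - y[i]) * X[i]  # gradient of 0.5 * (x' theta - y)^2
        G += g * g
        theta -= eta * g / np.sqrt(G + eps)
    return theta

def stochastic_newton_linear_regression(X, y, c=1.0):
    """One pass of a stochastic Newton recursion for least squares.

    A_inv tracks the inverse of (c * I + sum_i x_i x_i') via the
    Sherman-Morrison formula, so each step costs O(d^2); for a quadratic
    loss this recursion coincides with recursive least squares.
    """
    n, d = X.shape
    theta = np.zeros(d)
    A_inv = np.eye(d) / c
    for i in range(n):
        x = X[i]
        Ax = A_inv @ x
        A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)  # rank-one inverse update
        g = (x @ theta - y[i]) * x
        theta -= A_inv @ g  # Newton-type step with the inverse-Hessian estimate
    return theta

# Toy usage on synthetic data: both estimates should approach theta_star.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
theta_star = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ theta_star + 0.1 * rng.normal(size=10_000)
print(adagrad_linear_regression(X, y))
print(stochastic_newton_linear_regression(X, y))
```

Note the contrast the abstract draws: Adagrad only rescales each coordinate of the gradient, while the Newton-type step multiplies by a full inverse-Hessian estimate, which is what corrects for ill-conditioning across directions.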
Main file
L2_adaptative_gradient.pdf (505.08 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04004305, version 1 (27-02-2023)

License

Public domain

Identifiers

hal-04004305

Cite

Antoine Godichon-Baggioni, Pierre Tarrago. Non asymptotic analysis of Adaptive stochastic gradient algorithms and applications. 2023. ⟨hal-04004305⟩
72 Views
33 Downloads

