Conference paper, Year: 2022

Convergence Rates of Non-Convex Stochastic Gradient Descent Under a Generic Łojasiewicz Condition and Local Smoothness

Kevin Scaman
Cédric Malherbe
Ludovic dos Santos

Abstract

Training over-parameterized neural networks involves the empirical minimization of highly non-convex objective functions. Recently, a large body of work has provided theoretical evidence that, despite this non-convexity, properly initialized over-parameterized networks can converge to zero training loss, through the introduction of the Polyak-Łojasiewicz condition. However, these analyses are restricted to quadratic losses such as the mean squared error, and tend to indicate fast exponential convergence rates that are seldom observed in practice. In this work, we propose to extend these results by analyzing stochastic gradient descent under more generic Łojasiewicz conditions that are applicable to any convex loss function, thus extending the current theory to a broader class of losses commonly used in practice, such as cross-entropy. Moreover, our analysis provides high-probability bounds on the approximation error under sub-Gaussian gradient noise and only requires the local smoothness of the objective function, thus making it applicable to deep neural networks in realistic settings.
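To make the abstract's conditions concrete, the following LaTeX sketch states the classical Polyak-Łojasiewicz inequality, a generic Łojasiewicz condition of the kind the abstract refers to, and the SGD update. This is a minimal sketch, not the paper's exact statement: the minimum value f*, the constants mu and c, the exponent beta, and the step sizes eta_t are illustrative assumptions.

% Classical Polyak-Łojasiewicz (PL) inequality, with illustrative mu > 0:
\[
  \tfrac{1}{2}\,\|\nabla f(x)\|^2 \;\ge\; \mu \bigl(f(x) - f^*\bigr)
\]
% A generic Łojasiewicz condition, with illustrative c > 0 and exponent
% beta in (0, 1]; PL is recovered for beta = 1/2 with c = sqrt(2 mu):
\[
  \|\nabla f(x)\| \;\ge\; c \,\bigl(f(x) - f^*\bigr)^{\beta}
\]
% SGD update with step sizes eta_t and stochastic gradients g_t that are
% unbiased estimates of the true gradient (the abstract further assumes
% sub-Gaussian gradient noise):
\[
  x_{t+1} \;=\; x_t - \eta_t\, g_t,
  \qquad \mathbb{E}[\,g_t \mid x_t\,] = \nabla f(x_t)
\]

Under such a condition, the gradient norm lower-bounds the suboptimality gap, which is what allows convergence rates to be derived without convexity; the exact formulation and rates in the paper may differ.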
Main file
scaman22a.pdf (862.25 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03896012, version 1 (13-12-2022)

Identifiers

  • HAL Id: hal-03896012, version 1

Cite

Kevin Scaman, Cédric Malherbe, Ludovic dos Santos. Convergence Rates of Non-Convex Stochastic Gradient Descent Under a Generic Łojasiewicz Condition and Local Smoothness. ICML 2022 - 39th International Conference on Machine Learning, Jul 2022, Baltimore, United States. ⟨hal-03896012⟩