Preprint / Working paper, Year: 2022

Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

Abstract

In non-smooth stochastic optimization, we establish the non-convergence of stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold $M$ along which the function $f$ has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of $f$ is lower-bounded. We require two conditions on $f$. The first is a Verdier stratification condition, a refinement of the popular Whitney stratification; it allows us to establish a strengthened version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second, termed the angle condition, allows us to control the distance of the iterates to $M$. When $f$ is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
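
A minimal numerical sketch of the stochastic subgradient iteration discussed in the abstract (not part of the paper): the toy weakly convex function $f(x, y) = |x| - y^2 + \tfrac{1}{2} y^4$, the step-size schedule, and the noise model are illustrative assumptions only. Its origin is an active strict saddle with active manifold $M = \{x = 0\}$, and the noisy iterates are expected to escape it and approach a local minimizer at $(0, \pm 1)$.

```python
import numpy as np

def subgradient(x, y):
    """One Clarke subgradient of the assumed toy function f(x, y) = |x| - y**2 + 0.5*y**4."""
    gx = np.sign(x)                # any element of [-1, 1] is a valid subgradient at x = 0
    gy = -2.0 * y + 2.0 * y**3
    return np.array([gx, gy])

rng = np.random.default_rng(0)
z = np.array([0.5, 1e-3])          # start close to the active strict saddle (0, 0)
for n in range(1, 20001):
    gamma = 0.1 / n ** 0.6         # assumed step sizes gamma_n ~ n^{-0.6}
    noise = 0.01 * rng.standard_normal(2)        # zero-mean subgradient noise
    z = z - gamma * (subgradient(*z) + noise)    # stochastic subgradient update

print(z)  # x stays near 0, y drifts toward +/- 1: a local minimizer, not the saddle
```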
Main file: revised.pdf (537.91 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03442137 , version 1 (23-11-2021)
hal-03442137 , version 2 (21-11-2022)
hal-03442137 , version 3 (31-07-2023)

Identifiers

  • HAL Id: hal-03442137, version 2

Cite

Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions. 2022. ⟨hal-03442137v2⟩
297 Views
190 Downloads
