Preprints, Working Papers, ... Year: 2022

Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

Abstract

In non-smooth stochastic optimization, we establish the non-convergence of stochastic subgradient descent (SGD) to the critical points recently called active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold $M$ where the function $f$ has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of $f$ is lower-bounded. We require two conditions on $f$. The first assumption is a Verdier stratification condition, which is a refinement of the popular Whitney stratification. It allows us to establish a strengthened version of the projection formula of Bolte \emph{et al.} for Whitney stratifiable functions, which is of independent interest. The second assumption, termed the angle condition, allows us to control the distance of the iterates to $M$. When $f$ is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
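
To make the setting concrete, here is a minimal illustrative sketch, not the paper's algorithmic statement or an experiment from it: it runs the standard stochastic subgradient iteration $x_{k+1} = x_k - \gamma_k (g_k + \xi_k)$, with $g_k \in \partial f(x_k)$, on the toy weakly convex function $f(x,y) = |x| - y^2/2 + y^4/4$, whose origin is an active strict saddle with active manifold $M = \{x = 0\}$. The function, step sizes, and noise model below are assumptions chosen only for illustration.

# Illustrative sketch only: a toy run of stochastic subgradient descent on the
# weakly convex function f(x, y) = |x| - y**2/2 + y**4/4, whose origin is an
# active strict saddle (active manifold M = {x = 0}).  The function, step
# sizes and noise model are assumptions for this example, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def clarke_subgradient(x, y):
    """Return one element of the Clarke subdifferential of f at (x, y)."""
    gx = np.sign(x)           # any value in [-1, 1] is admissible when x == 0
    gy = -y + y ** 3          # gradient of the smooth part -y**2/2 + y**4/4
    return np.array([gx, gy])

def sgd(n_iter=20000, step0=0.1, noise=0.1):
    """Iterate z_{k+1} = z_k - gamma_k * (g_k + xi_k) with gamma_k ~ 1/sqrt(k)."""
    z = np.array([0.0, 1e-8])                 # start essentially at the saddle (0, 0)
    for k in range(1, n_iter + 1):
        gamma = step0 / np.sqrt(k)
        xi = noise * rng.standard_normal(2)   # zero-mean additive perturbation
        z = z - gamma * (clarke_subgradient(*z) + xi)
    return z

# The iterates leave the saddle at the origin and settle near one of the two
# local minimizers (0, 1) or (0, -1).
print(sgd())

In this toy run the iterates drift away from the saddle at the origin and approach one of the local minimizers $(0, \pm 1)$, illustrating the kind of behavior the abstract describes for definable weakly convex functions.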
Main file: revised.pdf (537.91 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03442137, version 1 (23-11-2021)
hal-03442137, version 2 (21-11-2022)

Identifiers

  • HAL Id: hal-03442137, version 2

Cite

Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions. 2022. ⟨hal-03442137v2⟩