Journal article. Mathematics of Operations Research, 2024

Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

Abstract

In non-smooth stochastic optimization, we establish the non-convergence of stochastic subgradient descent (SGD) to the critical points recently termed active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold $M$ along which the function $f$ has a direction of second-order negative curvature. Off this manifold, the norm of the Clarke subdifferential of $f$ is bounded away from zero. We require two conditions on $f$. The first assumption is a Verdier stratification condition, a refinement of the popular Whitney stratification. It allows us to establish a reinforced version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second assumption, termed the angle condition, allows us to control the distance of the iterates to $M$. When $f$ is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
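
For readers who want to experiment, here is a minimal Python sketch of the stochastic subgradient recursion the abstract refers to, run on an illustrative weakly convex function whose origin is an active strict saddle. The test function, step sizes, and noise model below are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's code): stochastic subgradient
# descent on the weakly convex function
#     f(x, y) = |x| + y**4 / 4 - y**2 / 2,
# whose origin is an active strict saddle: on the manifold M = {x = 0} the
# restriction of f has negative curvature at y = 0, while off M every Clarke
# subgradient satisfies |d/dx component| = 1. Step sizes and noise level are
# arbitrary choices satisfying the usual Robbins-Monro conditions.
import numpy as np

rng = np.random.default_rng(0)

def subgradient(z):
    """Return one Clarke subgradient of f at z = (x, y)."""
    x, y = z
    return np.array([np.sign(x), y**3 - y])  # sign(0) = 0 is a valid element of [-1, 1]

def sgd(z0, n_iter=20_000, c=0.5, noise=0.1):
    """Plain stochastic subgradient descent: z_{n+1} = z_n - gamma_n * (g_n + noise_n)."""
    z = np.array(z0, dtype=float)
    for n in range(1, n_iter + 1):
        gamma = c / n**0.6          # sum gamma_n = inf, sum gamma_n**2 < inf
        g = subgradient(z) + noise * rng.standard_normal(2)
        z = z - gamma * g
    return z

# Started exactly at the active strict saddle (0, 0), the noise pushes the
# iterates along the unstable y-direction and they settle near one of the
# local minimizers (0, 1) or (0, -1), in line with the non-convergence
# result described in the abstract.
print(sgd([0.0, 0.0]))
```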
Main file: tame.pdf (560.5 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03442137 , version 1 (23-11-2021)
hal-03442137 , version 2 (21-11-2022)
hal-03442137 , version 3 (31-07-2023)

Identifiers

Cite

Pascal Bianchi, Walid Hachem, Sholom Schechtman. Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions. Mathematics of Operations Research, 2024, 49 (3), pp.1761-1790. ⟨10.1287/moor.2021.0194⟩. ⟨hal-03442137v3⟩
297 views
190 downloads
