Stochastic mirror descent for variationally coherent optimization problems - Archive ouverte HAL
Conference paper. Year: 2017

Stochastic mirror descent for variationally coherent optimization problems

Abstract

In this paper, we examine a class of non-convex stochastic optimization problems which we call variationally coherent, and which properly includes pseudo-/quasi-convex and star-convex optimization problems. To solve such problems, we focus on the widely used stochastic mirror descent (SMD) family of algorithms (which contains stochastic gradient descent as a special case), and we show that the last iterate of SMD converges to the problem's solution set with probability 1. This result contributes to the landscape of non-convex stochastic optimization by clarifying that neither pseudo-/quasi-convexity nor star-convexity is essential for (almost sure) global convergence; rather, variational coherence, a much weaker requirement, suffices. We also characterize convergence rates for the subclass of strongly variationally coherent optimization problems and present simulation results.
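
To make the SMD family concrete, below is a minimal illustrative sketch of stochastic mirror descent with an entropic mirror map on the probability simplex. This is not the paper's experimental setup: the toy quadratic objective, the Gaussian gradient noise, the starting point, and the 1/sqrt(t) step sizes are all assumptions made for demonstration (a convex quadratic is, in particular, variationally coherent).

```python
# Hypothetical sketch of stochastic mirror descent (SMD) on the simplex.
# Objective, noise model, and step-size schedule are illustrative choices,
# not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

x_star = np.array([0.6, 0.3, 0.1])  # hypothetical optimum, already on the simplex

def noisy_grad(x):
    # Unbiased stochastic gradient of the toy objective f(x) = ||x - x_star||^2 / 2,
    # perturbed by zero-mean Gaussian noise.
    return (x - x_star) + 0.1 * rng.standard_normal(x.size)

def smd_step(x, g, step):
    # One mirror descent step under the negative-entropy mirror map: the
    # exponentiated-gradient (multiplicative) update, renormalized to the simplex.
    y = x * np.exp(-step * g)
    return y / y.sum()

x = np.ones(3) / 3  # start at the barycenter of the simplex
for t in range(1, 5001):
    x = smd_step(x, noisy_grad(x), step=1.0 / np.sqrt(t))  # vanishing step sizes

print(x)  # the last iterate drifts toward x_star, illustrating last-iterate convergence
```

With the Euclidean mirror map in place of the entropic one, the same update reduces to stochastic gradient descent, which is the sense in which SGD is a special case of SMD.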
No file deposited

Dates and versions

hal-01643342, version 1 (21-11-2017)

Identifiers

  • HAL Id: hal-01643342, version 1

Cite

Zhengyuan Zhou, Panayotis Mertikopoulos, Nicholas Bambos, Stephen Boyd, Peter W. Glynn. Stochastic mirror descent for variationally coherent optimization problems. NIPS '17: Proceedings of the 31st International Conference on Neural Information Processing Systems, Dec 2017, Long Beach, CA, United States. ⟨hal-01643342⟩