Journal article in IMA Journal of Numerical Analysis, 2019

Adaptive restart of accelerated gradient methods under local quadratic growth condition

Abstract

By analyzing accelerated proximal gradient methods under a local quadratic growth condition, we show that restarting these algorithms at any frequency yields a globally linearly convergent algorithm. This result was previously known only for sufficiently long restart periods. Since the rate of convergence depends on how well the restart frequency matches the quadratic error bound, we then design a scheme that automatically adapts the restart frequency based on the observed decrease of the norm of the gradient mapping. Our algorithm has a better theoretical bound than previously proposed methods for adapting to the quadratic error bound of the objective. We illustrate the efficiency of the algorithm on a Lasso problem and on a regularized logistic regression problem.
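To make the restart scheme concrete, here is a minimal Python sketch on a Lasso problem, not the authors' exact algorithm: FISTA is restarted every K iterations, and K is doubled whenever the norm of the gradient mapping fails to halve over a restart period. The halving threshold, the doubling rule, and the names fista_adaptive_restart and K0 are illustrative assumptions; only the general idea, adapting the restart frequency from the observed decrease of the gradient mapping norm, comes from the abstract.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_adaptive_restart(A, b, lam, x0, n_epochs=50, K0=10):
    """Restarted FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Hypothetical sketch: restart every K iterations, and double K when
    the gradient-mapping norm does not halve over a restart period.
    """
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient

    def grad_map_norm(x):
        # Norm of the gradient mapping G_L(x) = L * (x - prox_{lam/L}(x - grad f(x)/L))
        g = A.T @ (A @ x - b)
        return L * np.linalg.norm(x - soft_threshold(x - g / L, lam / L))

    x, K = x0.copy(), K0
    gm_prev = grad_map_norm(x)
    for _ in range(n_epochs):
        y, t = x.copy(), 1.0           # restart: reset momentum
        for _ in range(K):             # one restart period of plain FISTA
            g = A.T @ (A @ y - b)
            x_new = soft_threshold(y - g / L, lam / L)
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)
            x, t = x_new, t_new
        gm = grad_map_norm(x)
        if gm > 0.5 * gm_prev:         # insufficient decrease: period too short
            K *= 2
        gm_prev = gm
    return x

# Usage on a synthetic Lasso instance
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 200))
b = A @ (rng.standard_normal(200) * (rng.random(200) < 0.1))
x_hat = fista_adaptive_restart(A, b, lam=0.1, x0=np.zeros(200))
```

The doubling rule is the standard trick for this kind of adaptation: after finitely many adjustments, K exceeds the (unknown) period dictated by the quadratic error bound, at the cost of only a bounded multiplicative overhead, which is consistent with the improved theoretical bound claimed in the abstract.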

Dates and versions

hal-02269132, version 1 (22-08-2019)

Identifiers

Cite

Olivier Fercoq, Zheng Qu. Adaptive restart of accelerated gradient methods under local quadratic growth condition. IMA Journal of Numerical Analysis, 2019, ⟨10.1093/imanum/drz007⟩. ⟨hal-02269132⟩