Conference paper, Year: 2022

Super-Acceleration with Cyclical Step-sizes

Abstract

Cyclical step-sizes are becoming increasingly popular in the optimization of deep learning problems. Motivated by recent observations on the spectral gaps of Hessians in machine learning, we show that these step-size schedules offer a simple way to exploit them. More precisely, we develop a convergence rate analysis for quadratic objectives that provides optimal parameters and shows that cyclical learning rates can improve upon traditional lower complexity bounds. We further propose a systematic approach to design optimal first-order methods for quadratic minimization with a given spectral structure. Finally, we provide a local convergence rate analysis beyond quadratic minimization for the proposed methods and illustrate our findings through benchmarks on least squares and logistic regression problems.
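To make the idea concrete, here is a minimal illustrative sketch (not the authors' method nor its optimal tuning): plain gradient descent with a two-step cyclical step-size schedule on a diagonal quadratic whose Hessian eigenvalues lie in two narrow, well-separated clusters, mimicking the spectral-gap setting the abstract refers to. The step-sizes are heuristic reciprocals of the cluster centers, not the optimal parameters derived in the paper.

```python
# Illustrative sketch only: gradient descent with a length-2 cyclical step-size
# schedule on a quadratic with a two-cluster (gapped) Hessian spectrum.
import numpy as np

rng = np.random.default_rng(0)

# Hessian eigenvalues: one cluster near 1 and one near 100 (spectral gap).
eigs = np.concatenate([rng.uniform(0.95, 1.05, 25), rng.uniform(99.5, 100.5, 25)])
x_star = rng.standard_normal(eigs.size)  # minimizer of the quadratic

def grad(x):
    # Gradient of f(x) = 1/2 (x - x_star)^T diag(eigs) (x - x_star).
    return eigs * (x - x_star)

# Cycle of length 2: a long step targeting the small eigenvalues and a short
# step targeting the large ones (heuristic choice, not the paper's tuning).
# Note the long step (1.0) far exceeds the classical single-step stability
# limit 2/L ~ 0.02, yet the full cycle still contracts on both clusters.
steps = [1.0 / 1.0, 1.0 / 100.0]

x = np.zeros(eigs.size)
for k in range(200):
    x -= steps[k % len(steps)] * grad(x)

print("distance to optimum:", np.linalg.norm(x - x_star))
```

Under these assumptions, the per-cycle contraction factor on each eigenvalue λ is |(1 - λ·h0)(1 - λ·h1)|, which stays well below 1 on both clusters even though a single long step would diverge on the large eigenvalues; this is the kind of gap-exploiting behavior the paper analyzes and optimizes.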

Dates and versions

hal-03377367, version 1 (14-10-2021)

License

Attribution

Identifiers

Cite

Baptiste Goujaud, Damien Scieur, Aymeric Dieuleveut, Adrien Taylor, Fabian Pedregosa. Super-Acceleration with Cyclical Step-sizes. International Conference on Artificial Intelligence and Statistics, Mar 2022, Virtual conference, France. ⟨hal-03377367⟩