Fast convex optimization via closed-loop time scaling of gradient dynamics
Abstract
In a Hilbert setting, for convex differentiable optimization, we develop a general framework for adaptive accelerated gradient methods. These methods are based on damped inertial dynamics whose coefficients are designed in a closed-loop way: the damping is a feedback control of the velocity, or of the gradient of the objective function. To this end, we develop a closed-loop version of the time scaling and averaging technique introduced by the authors. We thus obtain autonomous inertial dynamics which involve vanishing viscous damping and implicit Hessian-driven damping. By simply using the convergence rates of the continuous steepest descent together with Jensen's inequality, without any further Lyapunov analysis, we show that the trajectories enjoy several remarkable properties at once: fast convergence of the function values, fast convergence of the gradients towards zero, and convergence of the trajectories to optimal solutions. Our approach leads to parallel algorithmic results, which we study in the case of proximal algorithms. These are among the very first general results of this type obtained with autonomous dynamics.
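To fix ideas, here is a rough schematic of the (open-loop) time scaling and averaging mechanism that the closed-loop construction builds on; the notation f, z, y, τ, x is generic and is not the paper's exact formulation. Starting from the continuous steepest descent

\[
\dot z(s) + \nabla f(z(s)) = 0, \qquad f(z(s)) - \min f = \mathcal{O}(1/s),
\]

a time scale s = τ(t) gives y(t) := z(τ(t)), which satisfies

\[
\dot y(t) + \dot\tau(t)\,\nabla f(y(t)) = 0, \qquad f(y(t)) - \min f = \mathcal{O}\!\big(1/\tau(t)\big),
\]

and an averaged trajectory x(t), defined as a weighted mean of {y(u) : u ≤ t}, inherits the rate \mathcal{O}(1/\tau(t)) by Jensen's inequality (convexity of f), while solving a second-order inertial dynamic with vanishing viscous damping. In the closed-loop version studied here, τ is not prescribed in advance: its growth is driven by a feedback of the velocity ‖ẋ(t)‖ or of the gradient ‖∇f(x(t))‖, which makes the resulting inertial dynamic autonomous.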
Keywords
fast convex optimization
damped inertial dynamic
time scaling
averaging
closed-loop control
Nesterov and Ravine algorithms
Hessian driven damping
proximal algorithms

AMS subject classification
37N40, 46N10, 49M30, 65B99, 65K05, 65K10, 90B50, 90C25
Domains
Optimization and Control [math.OC]