Acceleration of saddle-point methods in smooth cases
Abstract
In this paper we propose a novel convergence analysis of the Alternating Direction Method of Multipliers (ADMM), based on its equivalence with the overrelaxed Primal-Dual Hybrid Gradient (oPDHG) algorithm. We consider the smooth case, which corresponds to the case where the objective function can be decomposed into a differentiable part with Lipschitz continuous gradient and a strongly convex part. An accelerated variant of ADMM is also proposed, which is shown to converge linearly at the same rate as oPDHG.
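For context, a minimal sketch of the classical PDHG iteration of Chambolle and Pock for a saddle-point problem of the form $\min_x \max_y \, \langle Kx, y\rangle + g(x) - f^*(y)$ is given below. This is the standard scheme, not the paper's overrelaxed or accelerated variant, and the symbols $K$, $g$, $f$ and the step sizes $\tau$, $\sigma$, $\theta$ are generic placeholders rather than the paper's notation.

```latex
% Standard PDHG iteration (Chambolle--Pock) for
%   min_x max_y  <Kx, y> + g(x) - f^*(y).
% Assumed step sizes tau, sigma > 0 with tau * sigma * ||K||^2 <= 1,
% and extrapolation parameter theta in [0, 1].
\begin{aligned}
  y^{k+1}       &= \operatorname{prox}_{\sigma f^*}\!\bigl(y^{k} + \sigma K \bar{x}^{k}\bigr), \\
  x^{k+1}       &= \operatorname{prox}_{\tau g}\!\bigl(x^{k} - \tau K^{*} y^{k+1}\bigr), \\
  \bar{x}^{k+1} &= x^{k+1} + \theta\bigl(x^{k+1} - x^{k}\bigr).
\end{aligned}
```

Under smoothness assumptions of the kind considered here (one part strongly convex, the other with Lipschitz continuous gradient), suitable constant choices of $\tau$, $\sigma$, $\theta$ are known to yield linear convergence, which is the regime the paper analyzes.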