Journal article in SIAM Journal on Optimization, 2020

A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm

Abstract

We develop and analyze an asynchronous algorithm for distributed convex optimization when the objective is written as a sum of smooth functions, local to each worker, and a non-smooth function. Unlike many existing methods, our distributed algorithm adjusts to various levels of communication cost, delays, computational power of the machines, and smoothness of the functions. A unique feature is that the stepsizes depend neither on communication delays nor on the number of machines, which is highly desirable for scalability. We prove that the algorithm converges linearly in the strongly convex case and provide convergence guarantees for the non-strongly convex case. The rates obtained match those of the vanilla proximal gradient algorithm over an epoch sequence, introduced in the paper, that subsumes the delays of the system. We provide numerical results on large-scale machine learning problems to demonstrate the merits of the proposed method.
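For concreteness, the composite problem described in the abstract can be written as below; the notation (M workers, local functions f_i, non-smooth term g, stepsize γ) is illustrative and not taken from the paper itself:

\min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{M} f_i(x) + g(x)

where each smooth f_i is held by worker i and g is the non-smooth function (for instance an \ell_1 regularizer). The "vanilla proximal gradient" baseline to which the rates are compared iterates, with stepsize \gamma,

x^{k+1} = \operatorname{prox}_{\gamma g}\big( x^k - \gamma \nabla f(x^k) \big), \qquad f := \sum_{i=1}^{M} f_i.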
Main file
Dist_Prox_Grad.pdf (1.37 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01821683, version 1 (22-06-2018)
hal-01821683, version 2 (17-11-2020)

Identifiers

Cite

Konstantin Mishchenko, Franck Iutzeler, Jérôme Malick. A Distributed Flexible Delay-tolerant Proximal Gradient Algorithm. SIAM Journal on Optimization, 2020, 30 (1), pp.933-959. ⟨10.1137/18M1194699⟩. ⟨hal-01821683v2⟩
156 views
109 downloads
