Conference paper, Year: 2021

Preserved central model for faster bidirectional compression in distributed settings

Abstract

We develop a new approach to tackle communication constraints in a distributed learning problem with a central server. We propose and analyze a new algorithm that performs bidirectional compression and achieves the same convergence rate as algorithms using only uplink (from the local workers to the central server) compression. To obtain this improvement, we design MCM, an algorithm such that the downlink compression only impacts local models, while the global model is preserved. As a result, and contrary to previous works, the gradients on local workers are computed on perturbed models. Consequently, convergence proofs are more challenging and require precise control of this perturbation. To ensure this, MCM additionally combines model compression with a memory mechanism. This analysis opens new doors, e.g., incorporating worker-dependent randomized models and partial participation.
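To make the mechanism described in the abstract concrete, the following is a minimal, self-contained sketch of an MCM-style update loop. It is not the authors' implementation: it assumes random sparsification as both the uplink and downlink compressor, compresses raw gradients on the uplink (the exact uplink scheme in the paper may differ), and all names (`mcm_sketch`, `rand_sparsify`, `grad_fns`, `alpha`) are hypothetical. Its only purpose is to illustrate that downlink compression is applied to the difference between the global model and a memory term, so only the local models are perturbed while the central model is never compressed.

```python
import numpy as np

def rand_sparsify(x, p=0.1, rng=None):
    """Unbiased random sparsification: keep each coordinate with probability p, rescale by 1/p."""
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) < p
    return np.where(mask, x / p, 0.0)

def mcm_sketch(grad_fns, dim, n_steps=100, lr=0.1, alpha=0.5, p=0.1, seed=0):
    """Illustrative MCM-style loop (simplified, hypothetical): the central model `w`
    is never degraded by downlink compression; only the local models `w_hat` are."""
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)              # preserved central (global) model
    h = np.zeros(dim)              # downlink memory shared by server and workers
    for _ in range(n_steps):
        # Downlink: compress only the difference between the model and the memory.
        delta = rand_sparsify(w - h, p, rng)
        w_hat = h + delta          # perturbed local model used by the workers
        h = h + alpha * delta      # memory update keeps the perturbation under control
        # Uplink: each worker computes its gradient at the perturbed model and compresses it.
        g = np.mean([rand_sparsify(gf(w_hat), p, rng) for gf in grad_fns], axis=0)
        # The global model is updated with the aggregated gradients and stays uncompressed.
        w = w - lr * g
    return w

# Toy usage: three workers, each with a quadratic objective centered at a different point;
# the iterates should approach the mean of the targets despite bidirectional compression.
targets = [np.ones(5), -np.ones(5), 2 * np.ones(5)]
grad_fns = [lambda w, t=t: w - t for t in targets]
w_final = mcm_sketch(grad_fns, dim=5)
```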
Main file: 2021-10-MCM_for_neurips-SM.pdf (1.1 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04255271, version 1 (23-10-2023)


Cite

Constantin Philippenko, Aymeric Dieuleveut. Preserved central model for faster bidirectional compression in distributed settings. 35th Conference on Neural Information Processing Systems, Dec 2021, Virtual-only Conference, France. pp.2387-2399, ⟨10.48550/arXiv.2102.12528⟩. ⟨hal-04255271⟩