Preprint / Working paper, Year: 2020

Artemis: tight convergence guarantees for bidirectional compression in heterogeneous settings for federated learning

Abstract

We introduce Artemis, a framework to tackle the problem of learning in a distributed or federated setting under communication constraints. Several workers (randomly sampled) carry out the optimization process, with a central server aggregating their computations. To alleviate the communication cost, Artemis compresses the information sent in both directions (from the workers to the server and conversely) and combines this with a memory mechanism. It improves on existing algorithms that only consider unidirectional compression (to the server) or rely on very strong assumptions on the compression operator. We provide fast rates of convergence (linear up to a threshold) under weak assumptions on the stochastic gradients (noise variance bounded only at the optimal point) in the non-i.i.d. setting, highlight the impact of memory for unidirectional and bidirectional compression, and analyze Polyak-Ruppert averaging. We use convergence in distribution to obtain a lower bound on the asymptotic variance, which highlights practical limits of compression.
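To make the mechanism described in the abstract concrete, the following is a minimal sketch in Python of one round of bidirectional compression with a memory mechanism. It is an illustration only, not the authors' implementation: the compressor rand_sparsify (random sparsification), the step size lr, the memory rate alpha, the sparsity level k, and the toy worker objectives are placeholder assumptions, and no worker sampling or convergence tuning is attempted.

    import numpy as np

    def rand_sparsify(v, k):
        # Placeholder unbiased compressor: keep k random coordinates,
        # rescaled by d/k so that the compressed vector equals v in expectation.
        d = v.size
        mask = np.zeros(d)
        idx = np.random.choice(d, size=k, replace=False)
        mask[idx] = d / k
        return v * mask

    def artemis_style_round(w, grad_fns, memories, lr=0.05, alpha=0.1, k=10):
        # One illustrative round: each worker compresses the difference between
        # its stochastic gradient and its local memory (uplink); the server
        # averages the reconstructed gradients and compresses the resulting
        # update before broadcasting it (downlink).
        reconstructed = []
        for i, grad_fn in enumerate(grad_fns):
            g = grad_fn(w)                               # local stochastic gradient
            delta = rand_sparsify(g - memories[i], k)    # uplink: compress g_i - h_i
            reconstructed.append(memories[i] + delta)    # server-side estimate of g_i
            memories[i] = memories[i] + alpha * delta    # memory update h_i <- h_i + alpha * delta_i
        update = rand_sparsify(np.mean(reconstructed, axis=0), k)  # downlink compression
        return w - lr * update, memories

    # Toy usage: two workers with different quadratic objectives (heterogeneous data).
    d = 50
    targets = [np.ones(d), -np.ones(d)]
    grad_fns = [lambda w, t=t: (w - t) + 0.01 * np.random.randn(d) for t in targets]
    w, memories = np.zeros(d), [np.zeros(d) for _ in targets]
    for _ in range(500):
        w, memories = artemis_style_round(w, grad_fns, memories)

The memory term h_i plays the role described in the abstract: in the heterogeneous (non-i.i.d.) setting, compressing the difference g_i - h_i rather than the raw gradient keeps the compression error from being driven by the spread between local gradients.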
Main file
2023_12-Artemis_HAL.pdf (1.47 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04350055, version 1 (18-12-2023)

Identifiers

hal-04350055

Cite

Constantin Philippenko, Aymeric Dieuleveut. Artemis: tight convergence guarantees for bidirectional compression in heterogeneous settings for federated learning. 2020. ⟨hal-04350055⟩
