Journal article, Transactions on Machine Learning Research Journal, 2022

Communication-Efficient Distributionally Robust Decentralized Learning

Matteo Zecchin
Marios Kountouris
David Gesbert

Abstract

Decentralized learning algorithms empower interconnected devices to share data and computational resources in order to collaboratively train a machine learning model without the aid of a central coordinator. When data distributions are heterogeneous across the network nodes, collaboration can yield predictors with unsatisfactory performance for a subset of the devices. For this reason, in this work we formulate a distributionally robust decentralized learning task and propose a decentralized single-loop gradient descent/ascent algorithm (AD-GDA) to directly solve the underlying minimax optimization problem. We render our algorithm communication-efficient by employing a compressed consensus scheme, and we provide convergence guarantees for smooth convex and non-convex loss functions. Finally, we corroborate the theoretical findings with empirical results that highlight AD-GDA's ability to provide unbiased predictors and to greatly improve communication efficiency compared to existing distributionally robust algorithms.
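To make the abstract concrete, below is a minimal NumPy sketch of the general flavor of such an approach: each node takes a local descent step on the model (primal variable), nodes exchange compressed corrections through a gossip/consensus step, and a projected ascent step pushes the mixture weights over the data distributions toward the worst case. The function names, step sizes, top-k compressor, and the simplified (centrally computed) dual update are illustrative assumptions for this sketch, not the paper's exact AD-GDA specification.

```python
import numpy as np

def topk(v, k):
    """Top-k sparsification: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def project_simplex(lam):
    """Euclidean projection of lam onto the probability simplex."""
    u = np.sort(lam)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(lam)) + 1) > 0)[0][-1]
    return np.maximum(lam - css[rho] / (rho + 1), 0.0)

def robust_gda_round(x, x_hat, lam, grads, dist_losses, W,
                     lr_w=0.05, lr_lam=0.05, gamma=0.5, k=5):
    """One illustrative round of compressed-gossip descent/ascent on n nodes.
    x, x_hat : (n, d) local models and their publicly shared (compressed) estimates
    lam      : (m,) mixture weights over the m data distributions (ascent variable)
    grads    : list of n callables grad_i(w, lam) -> (d,) local weighted-loss gradient
    dist_losses : callable losses(w) -> (m,) per-distribution losses
    W        : (n, n) symmetric doubly stochastic gossip matrix
    """
    n, d = x.shape
    # 1) Each node broadcasts only a sparsified correction to its public estimate.
    q = np.stack([topk(x[i] - x_hat[i], k) for i in range(n)])
    x_hat = x_hat + q
    # 2) Consensus step performed on the shared (compressed) estimates only.
    x = x + gamma * (W @ x_hat - x_hat)
    # 3) Local descent step on the model (primal variable).
    x = x - lr_w * np.stack([grads[i](x[i], lam) for i in range(n)])
    # 4) Projected ascent step on the mixture weights (dual variable).
    #    For simplicity this sketch evaluates the losses at the average model,
    #    whereas a fully decentralized method would keep this step local as well.
    lam = project_simplex(lam + lr_lam * dist_losses(x.mean(axis=0)))
    return x, x_hat, lam
```

Running this round repeatedly alternates a single descent step with a single ascent step, which is what "single-loop" refers to; the compression in step 1 is where the communication savings come from.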
Main file
473_communication_efficient_distri.pdf (1.72 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03926701, version 1 (06-01-2023)

License

Identifiers

Cite

Matteo Zecchin, Marios Kountouris, David Gesbert. Communication-Efficient Distributionally Robust Decentralized Learning. Transactions on Machine Learning Research Journal, 2022. ⟨hal-03926701⟩