A General Theory for Federated Optimization with Asynchronous and Heterogeneous Clients Updates

Journal article, Journal of Machine Learning Research, 2023

Abstract

We propose a novel framework to study asynchronous federated learning optimization with delays in gradient updates. Our theoretical framework extends the standard FedAvg aggregation scheme by introducing stochastic aggregation weights to represent the variability of clients' update times, due for example to heterogeneous hardware capabilities. Our formalism applies to the general federated setting where clients have heterogeneous datasets and perform at least one step of stochastic gradient descent (SGD). We demonstrate convergence for such a scheme and provide sufficient conditions for the resulting minimum to be the optimum of the federated problem. We show that our general framework applies to existing optimization schemes, including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff. The theory provided here allows us to draw meaningful guidelines for designing federated learning experiments in heterogeneous conditions. In particular, we develop FedFix, a novel extension of FedAvg that enables efficient asynchronous federated training while preserving the convergence stability of synchronous aggregation. We empirically validate our theory with a series of experiments showing that asynchronous FedAvg leads to fast convergence at the expense of stability, and we finally demonstrate the improvements of FedFix over synchronous and asynchronous FedAvg.
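To make the abstract's central idea concrete, the following is a minimal Python sketch of a FedAvg-style aggregation round with stochastic weights standing in for variable client update times. The Bernoulli participation model, the per-client probabilities q_i, the importances p_i, and the unbiasedness rescaling p_i / q_i are illustrative assumptions for this sketch, not the paper's exact formalism.

    import numpy as np

    def stochastic_aggregate(global_params, client_updates, importances,
                             participation_probs, rng):
        """One aggregation round with stochastic weights (illustrative sketch).

        Each client's update is included with probability q_i, modeling
        heterogeneous hardware / delays, and rescaled by p_i / q_i so the
        aggregated update equals the deterministic FedAvg step in expectation.
        """
        new_params = np.copy(global_params)
        for delta, p_i, q_i in zip(client_updates, importances,
                                   participation_probs):
            if rng.random() < q_i:  # this client's update arrives in time
                new_params += (p_i / q_i) * delta
        return new_params

    # Example: three clients of equal importance but different speeds.
    rng = np.random.default_rng(0)
    theta = np.zeros(4)
    updates = [rng.normal(size=4) for _ in range(3)]
    theta = stochastic_aggregate(theta, updates,
                                 importances=[1/3, 1/3, 1/3],
                                 participation_probs=[0.9, 0.5, 0.2],
                                 rng=rng)

Setting all participation probabilities to 1 recovers standard synchronous FedAvg, which is the sense in which such a formalism generalizes existing schemes.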
Main file: A_General_Theory_for_Federated_Optimization_with_Delayed_Gradients_and_Heterogeneous_Data.pdf (1.32 MB)
Origin: files produced by the author(s)

Dates and versions

hal-03720629, version 1 (12-07-2022)

Identifiers

  • HAL Id: hal-03720629, version 1

Cite

Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi. A General Theory for Federated Optimization with Asynchronous and Heterogeneous Clients Updates. Journal of Machine Learning Research, 2023, 24, pp.1-43. ⟨hal-03720629⟩
