Conference paper, Year: 2021

Federated Expectation Maximization with heterogeneity mitigation and variance reduction

Abstract

The Expectation Maximization (EM) algorithm is the default algorithm for inference in latent variable models. As in any other field of machine learning, applications of latent variable models to very large datasets make the use of advanced parallel and distributed architectures mandatory. This paper introduces FedEM, the first extension of the EM algorithm to the federated learning context. FedEM is a new communication-efficient method, which handles partial participation of local devices and is robust to heterogeneous distributions of the datasets. To alleviate the communication bottleneck, FedEM compresses appropriately defined complete-data sufficient statistics. We also develop and analyze an extension of FedEM that further incorporates a variance reduction scheme. In all cases, we derive finite-time complexity bounds for smooth non-convex problems. Numerical results are presented to support our theoretical findings, as well as an application to federated missing-values imputation for biodiversity monitoring.
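To make the setting concrete, below is a minimal, illustrative sketch of a FedEM-style round on a toy one-dimensional Gaussian mixture: each active device computes its local complete-data sufficient statistics (E-step), compresses the difference with the server's running surrogate, and the server aggregates the compressed updates and performs the M-step. The helper names (e_step_suff_stats, compress, m_step), the random-sparsification compressor, the step size, and the toy data are assumptions made for illustration; they are not the paper's exact algorithm, notation, or variance-reduction scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def e_step_suff_stats(x, weights, means, variances):
    """Local E-step: averaged complete-data sufficient statistics of a K-component GMM."""
    log_r = (np.log(weights)
             - 0.5 * np.log(2.0 * np.pi * variances)
             - 0.5 * (x[:, None] - means) ** 2 / variances)
    log_r -= log_r.max(axis=1, keepdims=True)      # stabilised responsibilities
    r = np.exp(log_r)
    r /= r.sum(axis=1, keepdims=True)
    # Statistics (sum_n r_nk, sum_n r_nk x_n, sum_n r_nk x_n^2), averaged over local data.
    return np.stack([r.sum(axis=0), r.T @ x, r.T @ (x ** 2)]) / len(x)

def compress(s, p=0.5):
    """Unbiased random sparsification, a stand-in for a generic compression operator."""
    mask = rng.random(s.shape) < p
    return mask * s / p

def m_step(S):
    """M-step: map aggregated sufficient statistics back to mixture parameters."""
    Nk, S1, S2 = S
    means = S1 / Nk
    return Nk / Nk.sum(), means, np.maximum(S2 / Nk - means ** 2, 1e-6)

# Toy heterogeneous devices: each holds samples from a slightly shifted 2-component mixture.
clients = [rng.normal(loc=[-2.0 + 0.2 * i, 2.0 + 0.2 * i], scale=1.0, size=(200, 2)).ravel()
           for i in range(5)]

weights, means, variances = np.ones(2) / 2, np.array([-1.0, 1.0]), np.ones(2)
# Warm-start the server's running surrogate with one uncompressed full-participation E-step.
S = np.mean([e_step_suff_stats(x, weights, means, variances) for x in clients], axis=0)

for _ in range(50):
    # Partial participation: only a random subset of devices responds in this round.
    active = rng.choice(len(clients), size=len(clients) // 2, replace=False)
    deltas = [compress(e_step_suff_stats(clients[i], weights, means, variances) - S)
              for i in active]
    S = S + 0.5 * np.mean(deltas, axis=0)   # stochastic-approximation update of the statistics
    weights, means, variances = m_step(S)   # M-step on the server

print("weights:", weights, "means:", means, "variances:", variances)
```

The key design choice mirrored here is that devices exchange compressed complete-data sufficient statistics rather than raw data or gradients; the variance-reduced extension analyzed in the paper is not reproduced in this sketch.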
Main file: FedEMfinal_onHAL.pdf (869.13 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03333516 , version 1 (03-09-2021)
hal-03333516 , version 2 (02-11-2021)
hal-03333516 , version 3 (09-11-2021)

Identifiers

hal-03333516
DOI: 10.48550/arXiv.2111.02083

Cite

Aymeric Dieuleveut, Gersende Fort, Eric Moulines, Geneviève Robin. Federated Expectation Maximization with heterogeneity mitigation and variance reduction. NeurIPS 2021 - 35th Conference on Neural Information Processing Systems, Dec 2021, Sydney, Australia. ⟨10.48550/arXiv.2111.02083⟩. ⟨hal-03333516v3⟩
365 views
335 downloads
