Conference paper, 2024

Distributed Collapsed Gibbs Sampler for Dirichlet Process Mixture Models in Federated Learning

Abstract

Dirichlet Process Mixture Models (DPMMs) are widely used to address clustering problems. Their main advantage lies in their ability to automatically estimate the number of clusters during inference through the Bayesian non-parametric framework. However, inference becomes considerably slow as the dataset size increases. This paper proposes a new distributed Markov Chain Monte Carlo (MCMC) inference method for DPMMs (DisCGS) using sufficient statistics. Our approach uses the collapsed Gibbs sampler and is specifically designed to work on data distributed across independent and heterogeneous machines, which enables its use in horizontal federated learning. Our method achieves highly promising results and notable scalability. For instance, on a dataset of 100K data points, the centralized algorithm requires approximately 12 hours to complete 100 iterations, while our approach completes the same number of iterations in just 3 minutes, reducing the execution time by a factor of 200 without compromising clustering performance. The source code is publicly available at https://github.com/redakhoufache/DisCGS.
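To make the idea of collapsed Gibbs sampling with sufficient statistics concrete, below is a minimal single-machine sketch in Python for a one-dimensional Gaussian DPMM with known observation variance, where each cluster is summarized by just two sufficient statistics: its point count and sum. This is an illustrative sketch under simplifying assumptions, not the authors' distributed DisCGS implementation; the hyperparameter names and values (ALPHA, MU0, TAU0, SIGMA) are made up for the example.

import math
import random

# Illustrative hyperparameters (not taken from the paper)
ALPHA = 1.0            # DP concentration parameter
MU0, TAU0 = 0.0, 3.0   # Normal prior on each cluster mean
SIGMA = 1.0            # known observation standard deviation

def predictive_logpdf(x, n, s):
    # Log predictive density of x under a cluster with sufficient
    # statistics (n, s) = (count, sum); the cluster mean is collapsed out.
    prec = 1.0 / TAU0**2 + n / SIGMA**2
    mu_n = (MU0 / TAU0**2 + s / SIGMA**2) / prec
    var = 1.0 / prec + SIGMA**2
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu_n) ** 2 / var)

def gibbs_sweep(data, z, stats):
    # One collapsed Gibbs sweep: reassign each point given all the others.
    for i, x in enumerate(data):
        k = z[i]
        stats[k] = (stats[k][0] - 1, stats[k][1] - x)  # remove point i
        if stats[k][0] == 0:
            del stats[k]
        # CRP-weighted log-probabilities for existing clusters and a new one
        options, logw = [], []
        for c, (n, s) in stats.items():
            options.append(c)
            logw.append(math.log(n) + predictive_logpdf(x, n, s))
        options.append(max(stats, default=-1) + 1)  # fresh cluster label
        logw.append(math.log(ALPHA) + predictive_logpdf(x, 0, 0.0))
        m = max(logw)
        weights = [math.exp(l - m) for l in logw]
        k = random.choices(options, weights=weights)[0]
        z[i] = k
        n, s = stats.get(k, (0, 0.0))
        stats[k] = (n + 1, s + x)

if __name__ == "__main__":
    random.seed(0)
    data = [random.gauss(-4, 1) for _ in range(50)] + \
           [random.gauss(4, 1) for _ in range(50)]
    z = [0] * len(data)                      # start with one big cluster
    stats = {0: (len(data), sum(data))}      # (count, sum) per cluster
    for _ in range(100):
        gibbs_sweep(data, z, stats)
    print("clusters found:", len(stats))

Because each cluster is reduced to a small (count, sum) summary, machines holding disjoint shards of the data would only need to exchange these statistics rather than raw points; this is the kind of property that makes a sufficient-statistics-based sampler amenable to the federated setting described in the abstract.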
Main file: paper2.pdf (791.75 KB). Origin: files produced by the author(s).

Dates and versions

hal-04457596, version 1 (14-02-2024)

Identifiers

HAL Id: hal-04457596

Cite

Reda Khoufache, Mustapha Lebbah, Hanene Azzag, Etienne Goffinet, Djamel Bouchaffra. Distributed Collapsed Gibbs Sampler for Dirichlet Process Mixture Models in Federated Learning. SIAM International Conference on Data Mining (SDM24), Apr 2024, Houston, Texas, United States. pp. 815-823, ⟨10.1137/1.9781611978032.93⟩. ⟨hal-04457596⟩