Conference paper - Year: 2024

Improved Stability and Generalization Guarantees of the Decentralized SGD Algorithm

Abstract

This paper presents a new generalization error analysis for Decentralized Stochastic Gradient Descent (D-SGD) based on algorithmic stability. The obtained results overhaul a series of recent works that suggested an increased instability due to decentralization and a detrimental impact of poorly-connected communication graphs on generalization. On the contrary, we show, for convex, strongly convex and non-convex functions, that D-SGD can always recover generalization bounds analogous to those of classical SGD, suggesting that the choice of graph does not matter. We then argue that this result stems from a worst-case analysis, and we provide a refined optimization-dependent generalization bound for general convex functions. This new bound reveals that the choice of graph can in fact improve the worst-case bound in certain regimes, and that, surprisingly, a poorly-connected graph can even be beneficial for generalization.
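To make the setting concrete: in D-SGD, each node of the communication graph holds a local copy of the model parameters, takes a stochastic gradient step on its own data, and then averages its parameters with those of its neighbors through a doubly stochastic mixing matrix that encodes the graph. The sketch below is only a minimal illustration of this update rule, not the paper's algorithm or experimental setup; the function name dsgd, the deterministic toy losses standing in for stochastic gradients, and the step-size and iteration values are assumptions made for the example.

import numpy as np

def dsgd(grads, W, x0, lr=0.1, steps=200):
    # Each of the n nodes keeps its own copy of the parameters (one row of X).
    n = len(grads)
    X = np.tile(np.asarray(x0, dtype=float), (n, 1))
    for _ in range(steps):
        # Local gradient step at every node (deterministic here, stochastic in practice).
        G = np.stack([g(X[i]) for i, g in enumerate(grads)])
        X = X - lr * G
        # Gossip step: average parameters with neighbors via the mixing matrix W.
        X = W @ X
    return X.mean(axis=0)

# Toy usage: two nodes with quadratic losses f_i(x) = 0.5 * ||x - c_i||^2,
# so the local gradient is x - c_i; W is the complete-graph averaging matrix.
c = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
grads = [lambda x, ci=ci: x - ci for ci in c]
W = np.full((2, 2), 0.5)  # doubly stochastic mixing matrix
x_hat = dsgd(grads, W, x0=np.zeros(2))  # approaches the average minimizer [0.5, 0.5]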
Main file
Stability_and_generalization_D_SGD.pdf (487.29 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04611418, version 1 (13-06-2024)

Identifiers

Cite

Batiste Le Bars, Aurélien Bellet, Marc Tommasi, Kevin Scaman, Giovanni Neglia. Improved Stability and Generalization Guarantees of the Decentralized SGD Algorithm. ICML 2024 - The Forty-first International Conference on Machine Learning, Jul 2024, Vienna, Austria. ⟨hal-04611418⟩