Conference paper, 2023

From Noisy Fixed-Point Iterations to Private ADMM for Centralized and Federated Learning

Abstract

We study differentially private (DP) machine learning algorithms as instances of noisy fixed-point iterations, in order to derive privacy and utility results from this well-studied framework. We show that this new perspective recovers popular private gradient-based methods like DP-SGD and provides a principled way to design and analyze new private optimization algorithms in a flexible manner. Focusing on the widely used Alternating Direction Method of Multipliers (ADMM), we use our general framework to derive novel private ADMM algorithms for centralized, federated, and fully decentralized learning. For these three algorithms, we establish strong privacy guarantees leveraging privacy amplification by iteration and by subsampling. Finally, we provide utility guarantees using a unified analysis that exploits a recent linear convergence result for noisy fixed-point iterations.
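To make the fixed-point perspective concrete, below is a minimal sketch (not the paper's implementation) of how a private gradient step can be read as a noisy fixed-point iteration: the non-private operator is the gradient-descent map T(w) = w − γ∇f(w), and adding Gaussian noise to the clipped gradient yields a DP-SGD-style update as a special case. All names and parameters (`dp_fixed_point_step`, `clip_norm`, `noise_std`, `lam`) are illustrative assumptions, not from the paper.

```python
import numpy as np

def dp_fixed_point_step(w, grad_fn, step_size, clip_norm, noise_std, rng, lam=1.0):
    """One noisy fixed-point iteration w <- (1 - lam) * w + lam * (noisy T(w)),
    where T(w) = w - step_size * grad(w) is the gradient-descent map.
    With lam = 1 this reduces to a DP-SGD-style update."""
    g = grad_fn(w)
    # Clip the gradient to bound the sensitivity of the update.
    g = g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
    # Gaussian noise calibrated to clip_norm provides the privacy guarantee.
    noise = rng.normal(0.0, noise_std, size=w.shape)
    t_w = w - step_size * (g + noise)   # noisy evaluation of the operator T
    return (1.0 - lam) * w + lam * t_w  # fixed-point averaging step

# Example: private gradient descent on a toy quadratic f(w) = ||w||^2 / 2.
rng = np.random.default_rng(0)
w = np.ones(5)
for _ in range(100):
    w = dp_fixed_point_step(w, grad_fn=lambda v: v, step_size=0.1,
                            clip_norm=1.0, noise_std=0.5, rng=rng)
```

Under this reading, other noisy fixed-point operators (such as the ADMM operator studied in the paper) can be plugged in place of the gradient map, which is what enables the unified privacy and utility analysis described in the abstract.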

Dates and versions

hal-04260417, version 1 (26-10-2023)

License

Attribution

Identifiers

Cite

Edwige Cyffers, Aurélien Bellet, Debabrota Basu. From Noisy Fixed-Point Iterations to Private ADMM for Centralized and Federated Learning. Proceedings of the 40th International Conference on Machine Learning (ICML), Jul 2023, Honolulu, United States. ⟨hal-04260417⟩