Conference paper, Year: 2021

Constrained Differentially Private Federated Learning for Low-bandwidth Devices

Abstract

Federated learning has become a prominent approach when different entities want to collaboratively learn a common model without sharing their training data. However, federated learning has two main drawbacks. First, it is quite bandwidth-inefficient, as it involves many message exchanges between the aggregating server and the participating entities. This bandwidth cost and the corresponding processing cost can be prohibitive if the participating entities are, for example, mobile devices. Second, although federated learning improves privacy by not sharing data, recent attacks have shown that it still leaks information about the training data. This paper presents a novel privacy-preserving federated learning scheme. The proposed scheme provides theoretical privacy guarantees, as it is based on Differential Privacy. Furthermore, it optimizes model accuracy by constraining the model learning phase to a few selected weights. Finally, as shown experimentally, it reduces the upstream and downstream bandwidth by up to 99.9% compared to standard federated learning, making it practical for mobile systems.
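
The abstract describes two ingredients: differential privacy for the shared updates, and a learning phase constrained to a small set of selected weights, so that only those weights need to be perturbed and transmitted. The sketch below illustrates how such a combination could look in plain NumPy; it is not the authors' implementation, and the names (client_update, server_round, clip_norm, sigma) as well as the random choice of the selected weights are illustrative assumptions.

import numpy as np

def client_update(local_grad, selected, clip_norm=1.0, sigma=1.0, rng=None):
    """Restrict a client's update to the pre-selected weights, clip it, and add
    Gaussian noise before it is sent to the server."""
    rng = np.random.default_rng() if rng is None else rng
    restricted = local_grad[selected]                            # keep only k coordinates
    norm = np.linalg.norm(restricted)
    clipped = restricted * min(1.0, clip_norm / (norm + 1e-12))  # bound the sensitivity
    # Gaussian mechanism applied to the low-dimensional update only.
    return clipped + rng.normal(0.0, sigma * clip_norm, size=clipped.shape)

def server_round(model, client_grads, selected, **kwargs):
    """Average the noisy constrained updates and apply them to the shared model."""
    noisy = [client_update(g, selected, **kwargs) for g in client_grads]
    model[selected] += np.mean(noisy, axis=0)
    return model

# Toy usage: a 10,000-weight model where only 100 weights are ever updated and exchanged.
rng = np.random.default_rng(0)
dim, k = 10_000, 100
selected = rng.choice(dim, size=k, replace=False)   # fixed small set of selected weights
model = np.zeros(dim)
grads = [rng.normal(size=dim) for _ in range(5)]    # stand-in local gradients
model = server_round(model, grads, selected, sigma=1.0, rng=rng)

Because clients only ever exchange the k selected coordinates, both upstream and downstream traffic scale with k rather than with the full model size, which is the bandwidth saving the abstract refers to.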
Main file
Top_K_DP.pdf (286.43 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03266004, version 1 (21-06-2021)

Identifiers

  • HAL Id: hal-03266004, version 1

Cite

Raouf Kerkouche, Gergely Ács, Claude Castelluccia, Pierre Genevès. Constrained Differentially Private Federated Learning for Low-bandwidth Devices. UAI 2021 - 37th Conference on Uncertainty in Artificial Intelligence, Jul 2021, Online, United States. pp.1-18. ⟨hal-03266004⟩
