Correlated Noise Exchange for Distributed Averaging with Differential Privacy Guarantees
Abstract
The amount of personal data collected in our everyday interactions with connected devices offers great opportunities for innovative services fueled by machine learning, but it also raises serious concerns for the privacy of individuals. In this paper, we propose a differentially private protocol allowing a large set of users to compute the average of their local values. In contrast to existing work, our protocol does not rely on a third party or costly cryptographic primitives: we use simple pairwise exchanges of correlated Gaussian noise along the edges of a network graph. We analyze the differential privacy guarantees of our protocol and the role of the correlated noise, and show that we can match the accuracy of the trusted curator model. Furthermore, we design a verification procedure based on additively homomorphic commitments which offers protection against malicious users joining the service with the goal of manipulating the outcome of the algorithm.
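To give intuition for the pairwise correlated noise idea summarized above, the following is a minimal sketch (not the authors' exact protocol; graph, noise levels, and variable names are illustrative assumptions): each pair of neighboring users shares a Gaussian noise term that one adds and the other subtracts, so the correlated noise cancels in the global average while individual reported values remain masked.

```python
# Minimal sketch (assumed parameters, not the paper's exact protocol) of
# pairwise correlated Gaussian noise exchange for private distributed averaging.
# Each edge (u, v) shares a Gaussian noise term that user u adds and user v
# subtracts, so this noise cancels exactly in the global sum.

import numpy as np

rng = np.random.default_rng(0)

n_users = 5
local_values = rng.uniform(0.0, 1.0, size=n_users)  # private values x_u
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]     # ring graph (illustrative)

sigma_pair = 1.0     # std. dev. of pairwise correlated noise (assumed)
sigma_indep = 0.01   # small independent noise kept for residual privacy (assumed)

noisy = local_values.copy()
for u, v in edges:
    eta = rng.normal(0.0, sigma_pair)  # noise term shared by the pair (u, v)
    noisy[u] += eta                    # u adds the shared noise
    noisy[v] -= eta                    # v subtracts it, so it cancels in the sum

noisy += rng.normal(0.0, sigma_indep, size=n_users)  # independent noise

print("true average :", local_values.mean())
print("noisy average:", noisy.mean())  # close to the true average: pairwise noise cancels
```

Because the correlated terms cancel in the sum, the accuracy of the average is governed only by the small independent noise, which is the mechanism by which such protocols can approach the accuracy of the trusted curator model.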