Preprint, working paper. Year: 2024

Differentially Private Gradient Flow based on the Sliced Wasserstein Distance

Abstract

Safeguarding the privacy of sensitive training data is paramount, particularly in generative modeling. This can be achieved either through differentially private stochastic gradient descent or through a differentially private metric for training models or generators. In this paper, we introduce a novel differentially private generative modeling approach based on a gradient flow in the space of probability measures. To this end, we define the gradient flow of the Gaussian-smoothed Sliced Wasserstein distance and the associated stochastic differential equation (SDE). By discretizing this SDE and defining a numerical scheme to solve it, we show that, owing to the specific form of the SDE's drift term, the smoothing yields differential privacy through a Gaussian mechanism. We then analyze the differential privacy guarantee of our gradient flow, which accounts for both the smoothing and the Wiener process introduced by the SDE itself. Experiments show that, at a low privacy budget, our proposed model generates higher-fidelity data than a generator-based model, offering a promising alternative.
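
To make the ingredients mentioned in the abstract more concrete, below is a minimal NumPy sketch, not the authors' implementation, of the two building blocks it refers to: a Monte Carlo estimate of a Gaussian-smoothed sliced Wasserstein distance, and a single Euler-Maruyama step that combines a drift update with a Wiener increment. The function names smoothed_sliced_wasserstein and noisy_flow_step, the parameters sigma, n_projections, step_size, and noise_scale, and the equal-sample-size assumption are illustrative choices, not details taken from the paper.

    import numpy as np


    def smoothed_sliced_wasserstein(x, y, n_projections=100, sigma=0.1, rng=None):
        """Monte Carlo estimate of a Gaussian-smoothed sliced Wasserstein-2 distance.

        Both empirical samples (assumed to have the same size) are perturbed with
        N(0, sigma^2 I) noise, projected onto random unit directions, and the squared
        1D Wasserstein distance (quantile coupling via sorting) is averaged over the
        projections. Illustrative helper, not the paper's code.
        """
        rng = np.random.default_rng() if rng is None else rng
        d = x.shape[1]
        x_s = x + sigma * rng.standard_normal(x.shape)   # Gaussian smoothing of the samples
        y_s = y + sigma * rng.standard_normal(y.shape)
        theta = rng.standard_normal((n_projections, d))  # random projection directions
        theta /= np.linalg.norm(theta, axis=1, keepdims=True)
        px = np.sort(x_s @ theta.T, axis=0)              # sorted 1D projections
        py = np.sort(y_s @ theta.T, axis=0)
        return np.mean((px - py) ** 2)


    def noisy_flow_step(particles, drift, step_size, noise_scale, rng):
        """One Euler-Maruyama step: follow an externally estimated drift and add a
        Wiener increment, the two ingredients whose combination the abstract relates
        to the overall differential privacy guarantee."""
        dw = rng.standard_normal(particles.shape)
        return particles - step_size * drift + np.sqrt(2.0 * step_size) * noise_scale * dw


    # Toy usage on synthetic (hypothetical) data standing in for a sensitive dataset.
    rng = np.random.default_rng(0)
    data = rng.standard_normal((256, 2)) + 3.0
    particles = rng.standard_normal((256, 2))
    print(smoothed_sliced_wasserstein(particles, data, rng=rng))
    drift = particles - data.mean(axis=0)                # placeholder drift, not the SW-flow drift
    particles = noisy_flow_step(particles, drift, step_size=0.1, noise_scale=0.05, rng=rng)

In the paper's scheme, the drift comes from the smoothed sliced Wasserstein gradient rather than the placeholder used here; the sketch only illustrates how the smoothing noise and the Wiener term enter the particle update.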
Main file
DP_SW_GF (2).pdf (4.81 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04664174, version 1 (29-07-2024)

Identifiers

  • HAL Id: hal-04664174, version 1

Cite

Ilana Sebag, Muni Sreenivas, Jean-Yves Franceschi, Alain Rakotomamonjy, Mike Gartrell, et al.. Differentially Private Gradient Flow based on the Sliced Wasserstein Distance. 2024. ⟨hal-04664174⟩
