Preprint, Working Paper. Year: 2019

Using Wasserstein-2 regularization to ensure fair decisions with Neural-Network classifiers

Abstract

In this paper, we propose a new method to build fair Neural-Network classifiers by using a constraint based on the Wasserstein distance. More specifically, we detail how to efficiently compute the gradients of Wasserstein-2 regularizers for Neural-Networks. The proposed strategy is then used to train Neural-Network decision rules that favor fair predictions. Our method fully takes into account two specificities of Neural-Network training: (1) the network parameters are learned indirectly, based on automatic differentiation and on the loss gradients, and (2) batch training is the gold standard for approximating the parameter gradients, as it requires a reasonable amount of computation and efficiently explores the parameter space. Results are shown on synthetic data as well as on the UCI Adult Income dataset. Our method is shown to perform well compared with 'ZafarICWWW17' and with linear regression under Wasserstein-1 regularization, as in 'JiangUAI19', in particular when non-linear decision rules are required for accurate predictions.
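The abstract does not give the exact algorithm, but the core idea it describes, penalizing the gap between group-conditional prediction distributions with a batch-wise, autodiff-friendly Wasserstein-2 term, can be sketched as follows. The snippet below is a minimal PyTorch illustration, not the authors' implementation: it assumes 1-D prediction scores, a binary sensitive attribute, and approximates the squared Wasserstein-2 distance by quantile matching; the names `w2_penalty_1d`, `model`, `lam`, and the synthetic batch tensors are all hypothetical.

```python
import torch
import torch.nn.functional as F

def w2_penalty_1d(scores_a, scores_b, n_quantiles=32):
    """Squared Wasserstein-2 distance between two 1-D empirical score
    distributions, approximated by matching n_quantiles quantiles.
    Built only from differentiable ops, so autograd supplies its gradient."""
    q = torch.linspace(0.0, 1.0, n_quantiles, device=scores_a.device)
    return torch.mean((torch.quantile(scores_a, q) - torch.quantile(scores_b, q)) ** 2)

# Hypothetical setup: a small classifier and one synthetic mini-batch with
# features x, binary labels y, and a binary sensitive attribute s.
torch.manual_seed(0)
model = torch.nn.Sequential(torch.nn.Linear(5, 16), torch.nn.ReLU(), torch.nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(128, 5)
y = torch.randint(0, 2, (128,)).float()
s = torch.randint(0, 2, (128,))
lam = 1.0  # strength of the fairness regularizer

# One batch-wise training step: classification loss plus the W2 penalty
# between the score distributions of the two sensitive groups.
scores = torch.sigmoid(model(x).squeeze(-1))
loss = F.binary_cross_entropy(scores, y) + lam * w2_penalty_1d(scores[s == 0], scores[s == 1])
optimizer.zero_grad()
loss.backward()   # automatic differentiation handles the regularizer's gradient
optimizer.step()
```

Quantile matching is used here only because it keeps the penalty well defined when the two groups have different sizes within a batch; any differentiable estimator of the 1-D Wasserstein-2 distance would fit the same pattern.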

Dates and versions

hal-02271117, version 1 (26-08-2019)

Identifiers

Cite

Laurent Risser, Quentin Vincenot, Nicolas Couellan, Jean-Michel Loubes. Using Wasserstein-2 regularization to ensure fair decisions with Neural-Network classifiers. 2019. ⟨hal-02271117⟩