Conference paper, 2015

Correcting a Class of Complete Selection Bias with External Data Based on Importance Weight Estimation

Van-Tinh Tran
Alex Aussem

Abstract

We present a practical bias correction method for learning classifiers and regression models under a general class of selection bias. The method hinges on two assumptions: 1) a feature vector Xs exists such that S, the variable that controls the inclusion of the samples in the training set, is conditionally independent of (X, Y) given Xs; 2) one has access to some external samples drawn from the population as a whole in order to approximate the unbiased distribution of Xs. This general framework includes covariate shift and prior probability shift as special cases. We first show how importance weighting can remove this bias. We also discuss the case where our key assumption about Xs is not valid and the case where Xs is only partially observed in the test set. Experimental results on synthetic and real-world data demonstrate that our method works well in practice.
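The abstract describes reweighting the biased training samples by the importance weight w(Xs) = P(Xs) / P(Xs | S = 1), estimated with the help of external samples of Xs drawn from the population as a whole. Below is a minimal sketch of one standard way to obtain such weights with a discriminative density-ratio estimator; the function names and the use of logistic regression are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_importance_weights(xs_biased, xs_external):
    """Estimate w(xs) ~ P(xs) / P(xs | S=1) by training a probabilistic
    classifier to separate external (population) samples of Xs from the
    biased training samples, then converting its output into a density ratio."""
    X = np.vstack([xs_biased, xs_external])
    z = np.concatenate([np.zeros(len(xs_biased)),    # 0 = biased training sample
                        np.ones(len(xs_external))])  # 1 = external population sample
    clf = LogisticRegression(max_iter=1000).fit(X, z)
    p_ext = clf.predict_proba(xs_biased)[:, 1]
    # Odds ratio times the sample-size ratio approximates P(xs) / P(xs | S=1).
    return (len(xs_biased) / len(xs_external)) * p_ext / (1.0 - p_ext)

def fit_debiased_model(X_train, y_train, xs_train, xs_external, model):
    """Fit any estimator that accepts sample_weight on the reweighted training data."""
    w = estimate_importance_weights(xs_train, xs_external)
    model.fit(X_train, y_train, sample_weight=w)
    return model
```

Here xs_train contains only the Xs columns of the biased training set and xs_external the external observations of Xs; the downstream model only needs to accept a sample_weight argument during fitting.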
Main file
Selection_bias_ICONIP2015.pdf (308.89 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01247394 , version 1 (22-12-2015)

Cite

Van-Tinh Tran, Alex Aussem. Correcting a Class of Complete Selection Bias with External Data Based on Importance Weight Estimation. 22nd International Conference on Neural Information Processing (ICONIP 2015), Nov 2015, Istanbul, Turkey. ⟨10.1007/978-3-319-26555-1_13⟩. ⟨hal-01247394⟩