Correcting a Class of Complete Selection Bias with External Data Based on Importance Weight Estimation
Abstract
We present a practical bias-correction method for classification and regression models trained under a general class of selection bias. The method hinges on two assumptions: 1) there exists a feature vector Xs such that S, the variable controlling the inclusion of samples in the training set, is conditionally independent of (X, Y) given Xs; 2) one has access to external samples drawn from the population as a whole, used to approximate the unbiased distribution of Xs. This general framework includes covariate shift and prior probability shift as special cases. We first show how importance weighting can remove this bias. We also discuss the cases where our key assumption about Xs does not hold and where Xs is only partially observed in the test set. Experimental results on synthetic and real-world data demonstrate that our method works well in practice.
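A minimal sketch of one common way to obtain such importance weights, assuming a classifier-based density-ratio estimator (an illustration under stated assumptions, not necessarily the paper's exact estimator): train a probabilistic classifier to distinguish the biased training samples of Xs from the external population samples, convert its probabilities into weights w(xs) ∝ p_pop(xs)/p_train(xs), and fit the downstream model with those per-sample weights. The function name and the use of scikit-learn are assumptions for illustration.

```python
# Illustrative sketch (assumed approach): estimate importance weights
# w(x_s) = p_pop(x_s) / p_train(x_s) with a probabilistic classifier,
# then fit the downstream model with per-sample weights.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_importance_weights(xs_train, xs_external):
    """Density-ratio estimation via a domain classifier.

    xs_train:    selection features Xs of the biased training set
    xs_external: Xs samples drawn from the population as a whole
    """
    X = np.vstack([xs_train, xs_external])
    # Label 0 = biased training set, 1 = external (population) sample.
    d = np.concatenate([np.zeros(len(xs_train)), np.ones(len(xs_external))])
    clf = LogisticRegression(max_iter=1000).fit(X, d)
    # P(external | xs) for each biased training point.
    p = clf.predict_proba(xs_train)[:, 1]
    # By Bayes' rule, p_pop(xs)/p_train(xs) = [P(ext|xs)/P(train|xs)]
    # times the class-prior ratio n_train/n_external.
    return (p / (1.0 - p)) * (len(xs_train) / len(xs_external))

# Usage: pass the weights to any estimator that accepts them, e.g.
# model.fit(X_tr, y_tr, sample_weight=estimate_importance_weights(Xs_tr, Xs_ext))
```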
Domains
Artificial Intelligence [cs.AI]

Origin: Files produced by the author(s)