Mediated Uncoupled Learning and Validation with Bregman Divergences: Loss Family with Maximal Generality
Abstract
In mediated uncoupled learning (MU-learning), the goal is to predict an output variable Y given an input variable X, as in ordinary supervised learning, while the training dataset contains no joint samples of (X, Y) but only independent samples of (X, U) and (U, Y), each observed together with a mediating variable U. Existing MU-learning methods can handle only the squared loss, which prohibits the use of other popular loss functions such as the cross-entropy loss. We propose a general MU-learning framework that handles, in a unified manner, learning problems with Bregman divergences, which cover a wide range of loss functions useful for various types of tasks. This loss family has maximal generality among those whose minimizers characterize the conditional expectation. We prove that the proposed objective function is a tighter approximation to the oracle loss that one would minimize if ordinary supervised samples of (X, Y) were available. We also propose an estimator of an interval containing the expected test loss of a trained model's predictions, using only (X, U)- and (U, Y)-data. We provide a theoretical analysis of the excess risk for the proposed method and confirm its practical usefulness with regression experiments on synthetic data and low-quality image classification experiments on benchmark datasets.
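As background for the loss family named in the abstract, here is a minimal LaTeX sketch of the standard Bregman divergence definition and the conditional-expectation property the maximality claim rests on; the generator φ and the special cases shown are standard textbook examples, not notation taken from the paper itself.

```latex
% Bregman divergence generated by a differentiable, strictly convex phi:
\[
  B_{\phi}(y, \hat{y}) = \phi(y) - \phi(\hat{y})
    - \langle \nabla \phi(\hat{y}),\, y - \hat{y} \rangle .
\]
% Standard special cases:
%   phi(t) = \|t\|^2 recovers the squared loss \|y - \hat{y}\|^2;
%   phi(p) = \sum_i p_i \log p_i (negative entropy on the simplex)
%   recovers the KL divergence, i.e., the cross-entropy loss up to a
%   constant when y is a one-hot label.
% The property behind "maximal generality" in the abstract: for any
% valid generator phi, the population minimizer over measurable f is
% the conditional expectation,
\[
  \operatorname*{arg\,min}_{f}\;
    \mathbb{E}\!\left[ B_{\phi}\bigl(Y, f(X)\bigr) \right]
    = \mathbb{E}[\, Y \mid X \,],
\]
% and Bregman divergences are known to be the only losses with this
% property, which is what the maximality claim refers to.
```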
Domains
Computer Science [cs]