Conference paper, Year: 2015

A Theoretical Analysis of Metric Hypothesis Transfer Learning

Michaël Perrot
Amaury Habrard

Abstract

We consider the problem of transferring some a priori knowledge in the context of supervised metric learning approaches. While this setting has been successfully applied in several empirical contexts, no theoretical evidence exists to justify the approach. In this paper, we provide a theoretical justification based on the notion of algorithmic stability adapted to the regularized metric learning setting. We propose an on-average-replace-two-stability model that allows us to prove fast generalization rates when an auxiliary source metric is used to bias the regularizer. Moreover, we prove a consistency result showing the benefit of biased weighted regularized formulations, and we provide a method to estimate the associated weight. We also present experiments illustrating the benefits of the approach on standard metric learning tasks and on a transfer learning problem where few labelled data are available.
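To make the setting concrete, a biased regularized metric learning objective of the kind described above can be sketched as follows (the notation — pairwise loss ℓ, training examples z_i, source metric M_S, bias weight β, regularization parameter λ — is illustrative and not necessarily the paper's exact formulation):

\min_{M \succeq 0} \;\; \frac{1}{n^2} \sum_{i,j=1}^{n} \ell\!\left(M, z_i, z_j\right) \;+\; \lambda \,\lVert M - \beta M_S \rVert_F^2

Here the Frobenius-norm regularizer is biased towards the auxiliary source metric M_S: taking β = 0 recovers the standard unbiased regularization, while β > 0 transfers knowledge from the source metric; estimating this weight is one of the questions the paper addresses.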
No file deposited

Dates and versions

hal-01175610, version 1 (10-07-2015)

Identifiers

  • HAL Id: hal-01175610, version 1

Cite

Michaël Perrot, Amaury Habrard. A Theoretical Analysis of Metric Hypothesis Transfer Learning. International Conference on Machine Learning, Jul 2015, Lille, France. ⟨hal-01175610⟩