RKHS Weightings of Functions
Abstract
We examine a new general machine learning model for binary classification that takes the expected weighted output of a parameterized predictor, where the weighting belongs to an RKHS of functions over the parameter space. This weighting can be learned with various algorithms. One of them is Stochastic Functional Gradient Descent (SFGD), which iteratively samples a parameter and some training data, then computes an approximation of the functional gradient of the loss. Using the stability properties of the algorithm, we show that convergence is guaranteed, under mild assumptions, at a rate of O(1/√m) in the number m of training examples. Further theoretical analysis, based on the Rademacher complexity of the proposed class of predictors, yields a similar bound on the generalization error. We also present three alternative learning algorithms, as well as a procedure for pruning the model with the Lasso, and we prove an error bound for the resulting sparse predictor. Finally, we run experiments with simple instantiations of the model to demonstrate its usability, comparing the learning algorithms both with one another and with the state of the art.
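To make the model concrete, here is one way the predictor described above can be written; the symbols H_α, h_θ, μ, and k are our own illustrative notation, not taken from the abstract:

$$
H_\alpha(x) \;=\; \mathbb{E}_{\theta \sim \mu}\big[\alpha(\theta)\, h_\theta(x)\big],
\qquad \alpha \in \mathcal{H}_k ,
$$

where h_θ is a base predictor with parameter θ, μ is a sampling distribution over the parameter space, 𝓗_k is an RKHS of weighting functions with kernel k over the parameters, and the binary label is taken as sign(H_α(x)).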
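The following is a minimal sketch of how SFGD as described above could look in code, assuming a Gaussian kernel over the parameter space, a tanh base predictor, a standard-normal sampling distribution, and the logistic loss; all of these choices are illustrative assumptions, not the paper's exact instantiation.

```python
import numpy as np

def kernel(t1, t2, gamma=1.0):
    # Gaussian kernel over the parameter space (assumed choice).
    d = t1 - t2
    return np.exp(-gamma * np.dot(d, d))

def base_predictor(theta, x):
    # Assumed base predictor h_theta(x).
    return np.tanh(np.dot(theta, x))

class SFGD:
    """The weighting alpha is kept in its dual expansion
        alpha(.) = sum_t c_t * kernel(theta_t, .),
    so each functional-gradient step appends one (c_t, theta_t) pair."""

    def __init__(self, dim, eta=0.1, gamma=1.0, seed=0):
        self.dim, self.eta, self.gamma = dim, eta, gamma
        self.coefs, self.centers = [], []
        self.rng = np.random.default_rng(seed)

    def alpha(self, theta):
        # Evaluate the current weighting at a parameter value.
        return sum(c * kernel(t, theta, self.gamma)
                   for c, t in zip(self.coefs, self.centers))

    def predict(self, x, n_mc=200):
        # Monte-Carlo estimate of H_alpha(x) = E_theta[alpha(theta) h_theta(x)].
        thetas = self.rng.standard_normal((n_mc, self.dim))
        h = np.mean([self.alpha(t) * base_predictor(t, x) for t in thetas])
        return np.sign(h)

    def step(self, x, y):
        # Sample one parameter and use the one-sample estimate
        # z = alpha(theta) * h_theta(x) of H_alpha(x) in the logistic
        # loss l(z) = log(1 + exp(-y * z)), for a label y in {-1, +1}.
        theta = self.rng.standard_normal(self.dim)
        h = base_predictor(theta, x)
        z = self.alpha(theta) * h
        dloss = -y * h / (1.0 + np.exp(y * z))
        # By the reproducing property, the stochastic functional gradient
        # is dloss * kernel(theta, .), so the descent step appends the
        # coefficient -eta * dloss at the sampled parameter.
        self.coefs.append(-self.eta * dloss)
        self.centers.append(theta)
```

A training loop would repeatedly call `step` on examples drawn from the training set; the Lasso pruning mentioned in the abstract would presumably then sparsify the accumulated expansion, though the exact procedure is specified in the paper, not here.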