Classification and feature selection using a primal-dual method and projection on structured constraints
Abstract
This paper concerns feature selection via supervised classification on high-dimensional datasets. The classical approach projects the data onto a low-dimensional space and classifies by minimizing an appropriate quadratic cost. We first introduce a matrix of centers in the definition of this cost. Moreover, since quadratic costs are not robust to outliers, we propose instead an ℓ1 cost (or a Huber loss to mitigate overfitting issues). While sparsity is commonly controlled by adding an ℓ1 constraint on the vectorized matrix of weights used for projecting the data, we propose to enforce structured sparsity. To this end we use constraints that take into account the matrix structure of the data, based either on the nuclear norm, on the ℓ2,1 norm, or on the ℓ1,2 norm, for which we provide a new projection algorithm. We optimize the projection matrix and the matrix of centers simultaneously with a new, tailored constrained primal-dual method. The primal-dual framework is general enough to encompass the various robust losses and structured constraints we use, and it allows a convergence analysis. We demonstrate the effectiveness of this approach on three biological datasets. Our primal-dual method with robust losses, adaptive centers, and structured constraints performs significantly better than classical methods, both in accuracy and in computational time.
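To make the structured-constraint idea concrete, the sketch below (not the authors' implementation) illustrates one of the building blocks mentioned in the abstract: Euclidean projection onto an ℓ2,1 ball. It uses the standard reduction to an ℓ1-ball projection of the row norms followed by row-wise rescaling; the function names `project_l1_ball` and `project_l21_ball` and the `radius` parameter are illustrative, not taken from the paper.

```python
import numpy as np

def project_l1_ball(v, radius):
    """Projection of a nonnegative vector v onto {x : sum(x) <= radius, x >= 0}
    (sort-based thresholding)."""
    if v.sum() <= radius:
        return v.copy()
    u = np.sort(v)[::-1]                      # sorted in decreasing order
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, v.size + 1) > css - radius)[0][-1]
    tau = (css[k] - radius) / (k + 1.0)       # soft-threshold level
    return np.maximum(v - tau, 0.0)

def project_l21_ball(W, radius):
    """Projection of a matrix W onto {X : sum_i ||X[i, :]||_2 <= radius}.
    Project the vector of row norms onto the l1 ball, then rescale each row."""
    row_norms = np.linalg.norm(W, axis=1)
    if row_norms.sum() <= radius:
        return W.copy()
    shrunk = project_l1_ball(row_norms, radius)
    scale = np.divide(shrunk, row_norms,
                      out=np.zeros_like(row_norms), where=row_norms > 0)
    return W * scale[:, None]
```

In a constrained primal-dual scheme of the kind described above, such a projection would typically play the role of the proximal step associated with the indicator of the constraint set applied to the projection matrix at each iteration; the analogous steps for the nuclear-norm and ℓ1,2 constraints require their own projection routines.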
Domains
Machine Learning [stat.ML]
Origin: Files produced by the author(s)