Two-sided Matrix Regression
Abstract
The two-sided matrix regression model Y = A*XB* + E aims at predicting Y by taking into account both the linear links among the column features of X, via the unknown matrix B*, and those among the row features of X, via the unknown matrix A*. We propose low-rank predictors in this high-dimensional matrix regression model via rank-penalized and nuclear-norm-penalized least squares. Neither criterion is jointly convex; however, we propose explicit predictors based on the SVD and show optimal prediction bounds. We give sufficient conditions for a consistent rank selector. We also propose a fully data-driven, rank-adaptive procedure. Simulation results confirm the good prediction and rank-consistency results under data-driven, explicit choices of the tuning parameters and the scaling parameter of the noise.
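The abstract does not spell out the explicit SVD-based predictors. The sketch below is a minimal illustration, not the paper's stated procedure: it assumes that, because the predictor AXB can realize any matrix of rank at most rank(X), the two penalized least-squares criteria reduce (via the Eckart–Young theorem) to thresholding the singular values of Y, with hard thresholding for the rank penalty and soft thresholding for the nuclear-norm penalty. The function name `svd_predictors` and the tuning parameter `lam` are hypothetical.

```python
import numpy as np

def svd_predictors(Y, lam):
    """Hedged sketch of explicit SVD-based low-rank predictors for the
    two-sided model Y = A*XB* + E, assuming the predictor M = AXB ranges
    over all matrices of rank <= rank(X), so that:
      - rank penalty:     min ||Y - M||_F^2 + lam * rank(M)  -> hard thresholding
      - nuclear norm:     min ||Y - M||_F^2 + lam * ||M||_*  -> soft thresholding
    Both solutions act on the singular values of Y (Eckart-Young).
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    # Hard thresholding: keep a singular value s_i only if s_i^2 > lam.
    s_hard = np.where(s**2 > lam, s, 0.0)
    M_rank = (U * s_hard) @ Vt
    # Soft thresholding: shrink every singular value by lam/2, floor at 0.
    s_soft = np.maximum(s - lam / 2.0, 0.0)
    M_nuc = (U * s_soft) @ Vt
    return M_rank, M_nuc

# Hypothetical usage on synthetic low-rank data plus noise:
rng = np.random.default_rng(0)
signal = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 40))
Y = signal + 0.1 * rng.standard_normal((50, 40))
M_rank, M_nuc = svd_predictors(Y, lam=4.0)
print(np.linalg.matrix_rank(M_rank), np.linalg.matrix_rank(M_nuc))
```

Both predictors are closed-form and share one SVD of Y, which is consistent with the abstract's claim of explicit predictors despite the non-convexity of the joint criteria; the specific thresholding rules above are standard results assumed here, not quoted from the paper.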