Convergence in sup-norm of least-squares estimators in regression with random design and nonparametric heteroscedastic noise, and its application to optimal model selection
Abstract
Recent advances in the theoretical analysis of optimality in model selection via penalization procedures, and more precisely concerning the validity of the Slope Heuristics, have led to investigating the consistency in sup-norm of M-estimators, in order to derive controls of the excess risk and of the empirical excess risk of an M-estimator that are optimal at the first order. Indeed, such controls are one of the keystones in justifying the Slope Heuristics. The author has previously shown the consistency of least-squares estimators in a heteroscedastic regression setting with random design, on suitable linear models of histograms and piecewise polynomials. In the present paper, we investigate a systematic approach to convergence in sup-norm for least-squares regression on finite-dimensional linear models. We give general constraints on the structure of these models that are sufficient to derive the consistency of the considered estimators. These constraints appear to be slightly more restrictive than the classical assumption of the existence of an orthonormal localized basis of the model. Nevertheless, our approach covers histograms and piecewise polynomials, but also, for example, some models of compactly supported wavelets, such as Haar expansions. Finally, our general result strictly extends the previous theoretical justifications of the Slope Heuristics that have been obtained in the heteroscedastic regression framework.
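For concreteness, the main objects discussed above admit the following standard formulations, recalled here only as a sketch; the notation (s*, σ, ε, m, D_m, r_m, φ_j) is our own and is not fixed by the abstract.

```latex
% Heteroscedastic regression with random design (standard formulation, assumed
% notation): the covariate X is random and the noise level \sigma may depend
% on X (nonparametric heteroscedasticity).
\[
  Y \;=\; s^{*}(X) \,+\, \sigma(X)\,\varepsilon,
  \qquad \mathbb{E}\!\left[\varepsilon \mid X\right] = 0 .
\]
% Least-squares estimator on a finite-dimensional linear model m,
% built from an i.i.d. sample (X_1, Y_1), ..., (X_n, Y_n):
\[
  \hat{s}_m \;\in\; \operatorname*{arg\,min}_{s \in m}\;
  \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - s(X_i) \bigr)^{2} .
\]
% Classical "localized basis" assumption: an orthonormal basis
% (\varphi_1, ..., \varphi_{D_m}) of m such that, for some constant r_m
% and every coefficient vector \beta \in \mathbb{R}^{D_m},
\[
  \Bigl\lVert \sum_{j=1}^{D_m} \beta_j \varphi_j \Bigr\rVert_{\infty}
  \;\le\; r_m \sqrt{D_m}\, \max_{1 \le j \le D_m} \lvert \beta_j \rvert .
\]
```

In this vocabulary, the controls mentioned in the abstract bear on the excess risk of the estimator and on its empirical counterpart; first-order optimal controls of these two quantities are what underpin the Slope Heuristics, whose well-known heuristic relation states that the optimal penalty is, at first order, twice the minimal one.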