PCA-Kernel Estimation
Abstract
Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample $\mathbf{X}_1, \ldots, \mathbf{X}_n$ onto the first $D$ eigenvectors of the Principal Component Analysis (PCA), with associated empirical projector $\hat{\Pi}_D$. Classical nonparametric inference methods such as kernel density estimation or kernel regression analysis are then performed in the (usually small) $D$-dimensional space. However, the mathematical analysis of this data-driven dimension reduction scheme raises technical problems, due to the fact that the random variables of the projected sample $(\hat{\Pi}_D \mathbf{X}_1, \ldots, \hat{\Pi}_D \mathbf{X}_n)$ are no longer independent. As a reference for further studies, we offer in this paper several results showing the asymptotic equivalence between important kernel-related quantities based on the empirical projector and their counterparts based on the theoretical projector. As an illustration, we provide an in-depth analysis of the nonparametric kernel regression case.
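The two-step scheme described above is straightforward to reproduce numerically. The following is a minimal Python/NumPy sketch, assuming a Gaussian kernel and the Nadaraya-Watson regression estimate; the function names, bandwidth $h$, and toy data are illustrative assumptions, not the paper's construction. Since the top-$D$ eigenvectors form an orthonormal basis, kernel distances computed in the $D$ coordinates coincide with distances between the projected points $\hat{\Pi}_D \mathbf{X}_i$.

```python
import numpy as np

def pca_projection(X, D):
    """Top-D PCA eigenvectors of the empirical covariance of X (n x p)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / X.shape[0]
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    top = np.argsort(eigvals)[::-1][:D]         # indices of the D largest
    return eigvecs[:, top], mu                  # p x D orthonormal basis, mean

def nadaraya_watson(X_train, y_train, X_query, h):
    """Gaussian-kernel Nadaraya-Watson regression estimate at X_query."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * h ** 2))            # kernel weights, m x n
    return (W @ y_train) / W.sum(axis=1)

# Toy usage (hypothetical data): reduce to D = 3 PCA coordinates,
# then run the kernel regression in the projected space.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                  # "high-dimensional" sample
y = np.sin(X @ np.ones(50) / 50) + 0.1 * rng.normal(size=200)
V, mu = pca_projection(X, D=3)
Z = (X - mu) @ V                                # projected sample, n x D
y_hat = nadaraya_watson(Z, y, Z[:5], h=0.5)     # fitted values at 5 points
```

Note that the empirical basis `V` is itself computed from the sample, so the rows of `Z` are not independent; this is precisely the dependence issue the paper's equivalence results address.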