New M-estimators in semiparametric regression with errors in variables
Abstract
In the regression model with errors in variables, we observe $n$ i.i.d. copies of $(Y,Z)$ satisfying $Y=f_{\theta^0}(X)+\xi$ and $Z=X+\varepsilon$, involving independent and unobserved random variables $X,\xi,\varepsilon$ and a regression function $f_{\theta^0}$ known up to a finite-dimensional parameter $\theta^0$. The common densities of the $X_i$'s and of the $\xi_i$'s are unknown, whereas the distribution of $\varepsilon$ is completely known. We aim at estimating the parameter $\theta^0$ from the observations $(Y_1,Z_1),\ldots,(Y_n,Z_n)$. We propose two estimation procedures based on the least squares criterion $\tilde S_{\theta^0,g}(\theta)=\mathbb{E}_{\theta^0,g}[(Y-f_\theta(X))^2w(X)]$, where $w$ is a weight function to be chosen. In the first estimation procedure, $w$ does not depend on $\theta$ and the distribution of the $\xi$'s is unknown. The second estimation procedure is based on $S_{\theta^0,g}(\theta)=\mathbb{E}_{\theta^0,g}[((Y-f_\theta(X))^2-\sigma_{\xi,2}^2)w_\theta(X)]$, where $w_\theta$ is a positive weight function to be chosen, and requires the knowledge of $\sigma_{\xi,2}^2=\mathrm{Var}(\xi)$. In both cases, we propose two estimators and derive upper bounds for their risk, depending on the smoothness of the error density $p_\varepsilon$ and on the smoothness of $w(x)f_\theta(x)$ or $w_\theta(x)f_\theta(x)$ with respect to $x$. Furthermore, we give sufficient conditions ensuring that the parametric rate of convergence is achieved. We provide practical recipes for the choice of $w$ or $w_\theta$ in the case of nonlinear regression functions that are piecewise smooth, allowing an improvement in the order of the rate of convergence, up to the parametric rate in some cases.
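To make the observation model concrete, the following is a minimal simulation sketch, assuming a toy linear $f_\theta$ and Gaussian laws for $X$, $\xi$ and $\varepsilon$ (all hypothetical choices, not taken from the paper). It only illustrates the data-generating mechanism $Y=f_{\theta^0}(X)+\xi$, $Z=X+\varepsilon$ and a naive weighted least-squares criterion computed with $Z$ in place of the unobserved $X$; it does not reproduce the paper's deconvolution-based estimators, which correct for the measurement error using the known density $p_\varepsilon$.

```python
import numpy as np

# Illustrative simulation of the errors-in-variables model described above:
# Y = f_theta0(X) + xi and Z = X + eps, with X, xi, eps independent and only
# (Y, Z) observed. All distributional choices below are hypothetical.
rng = np.random.default_rng(0)
n = 1000
theta0 = 2.0                      # true parameter (illustrative)

def f(theta, x):
    return theta * x              # toy regression function f_theta(x)

X = rng.normal(0.0, 1.0, n)       # unobserved design points
xi = rng.normal(0.0, 0.5, n)      # unobserved regression errors
eps = rng.normal(0.0, 0.3, n)     # measurement errors with known law

Y = f(theta0, X) + xi             # observed response
Z = X + eps                       # observed noisy covariate

# Naive least-squares criterion (weight w = 1) computed with Z instead of X.
# It ignores the measurement error and is therefore biased; it is shown only
# to make the population criterion concrete.
def naive_criterion(theta):
    return np.mean((Y - f(theta, Z)) ** 2)

thetas = np.linspace(0.0, 4.0, 401)
theta_naive = thetas[np.argmin([naive_criterion(t) for t in thetas])]
print("naive minimizer:", theta_naive)   # attenuated towards 0 relative to theta0
```

On such simulated data the naive minimizer exhibits the well-known attenuation bias, which is precisely the difficulty that the procedures summarized above are designed to overcome.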