Journal article in Sankhya A, 2016

Asymptotics for Regression Models Under Loss of Identifiability

Abstract

This paper discusses the asymptotic behavior of regression models under general conditions, in particular when the set of true parameters has dimension larger than zero and the true model is not identifiable. First, we establish a general inequality for the difference between the sum of squared errors (SSE) of the estimated regression model and the SSE of the true regression function. A set of generalized derivative functions is the key tool in deriving this inequality. Under a suitable Donsker condition for this set, we obtain the asymptotic distribution of the SSE difference. We show how to establish this Donsker property for parametric models even when the parameters characterizing the best regression function are not unique. The result is applied to neural network regression models with redundant hidden units, where loss of identifiability occurs, and provides guidance on how to penalize such models to avoid over-fitting.
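To fix ideas, the following is a minimal sketch, in generic notation that may differ from the paper's own, of the SSE difference whose asymptotic distribution is studied; the functions \(f^*\), \(\hat{f}_n\), and \(f_\theta\) below are illustrative placeholders rather than the paper's exact definitions.

\[
S_n(\hat{f}_n) - S_n(f^*) \;=\; \sum_{i=1}^{n}\bigl(Y_i - \hat{f}_n(X_i)\bigr)^2 \;-\; \sum_{i=1}^{n}\bigl(Y_i - f^*(X_i)\bigr)^2,
\]
where \(f^*\) denotes the best (true) regression function and \(\hat{f}_n\) the least-squares estimator. For a single-hidden-layer network
\[
f_\theta(x) \;=\; \beta_0 + \sum_{k=1}^{K}\beta_k\,\sigma\!\bigl(w_k^\top x + b_k\bigr),
\]
identifiability is lost when, for example, some \(\beta_k = 0\) (so the corresponding \((w_k, b_k)\) is arbitrary) or two hidden units carry identical weights; the set of parameters realizing \(f^*\) then has dimension larger than zero.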
File not deposited

Dates and versions

hal-01520204 , version 1 (10-05-2017)

Identifiers

  • HAL Id: hal-01520204, version 1

Cite

Joseph Rynkiewicz. Asymptotics for Regression Models Under Loss of Identifiability. Sankhya A, 2016, 78 (2), pp.155-179. ⟨hal-01520204⟩