The adaptive BerHu penalty in robust regression - Archive ouverte HAL
Journal article, Journal of Nonparametric Statistics, 2016

The adaptive BerHu penalty in robust regression

Laurent Zwald

Abstract

We intend to combine Huber's loss with an adaptive reversed version of it as a penalty function. The purpose is twofold: first, we would like to propose an estimator that is robust to data subject to heavy-tailed errors or outliers. Second, we hope to overcome the variable selection problem in the presence of highly correlated predictors. For instance, in this framework, the adaptive least absolute shrinkage and selection operator (lasso) is not a very satisfactory variable selection method, although it is a popular technique for simultaneous estimation and variable selection. We call this new penalty the adaptive BerHu penalty. As with the elastic net penalty, small coefficients contribute to this penalty through their ℓ1 norm, while larger coefficients cause it to grow quadratically (as in ridge regression). We show that the estimator associated with Huber's loss combined with the adaptive BerHu penalty enjoys theoretical properties in the fixed-design context. This approach is compared to existing regularization methods such as the adaptive elastic net, and is illustrated via simulation studies and real data.
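For reference, a minimal sketch of the standard (non-adaptive) Huber loss and reverse Huber (BerHu) function is given below; the thresholds M and c and the adaptive weights actually used in the paper are not specified in this abstract, so this form is only indicative of the lasso-like/ridge-like behaviour described above.

% Standard Huber loss with threshold M > 0: quadratic for small residuals,
% linear for large ones (robust to heavy tails and outliers).
\[
  \rho_M(t) =
  \begin{cases}
    \dfrac{t^2}{2}, & |t| \le M,\\[6pt]
    M|t| - \dfrac{M^2}{2}, & |t| > M.
  \end{cases}
\]
% Reverse Huber (BerHu) function with threshold c > 0 (indicative, non-adaptive form):
% small coefficients are penalised through their absolute value (lasso-like),
% large coefficients quadratically (ridge-like).
\[
  \mathrm{BerHu}_c(t) =
  \begin{cases}
    |t|, & |t| \le c,\\[6pt]
    \dfrac{t^2 + c^2}{2c}, & |t| > c.
  \end{cases}
\]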
Main file
BerhuOctobrerevision2.pdf (363.41 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01882461 , version 1 (27-09-2018)

Identifiers

HAL Id: hal-01882461
DOI: 10.1080/10485252.2016.1190359

Cite

Sophie Lambert-Lacroix, Laurent Zwald. The adaptive BerHu penalty in robust regression. Journal of Nonparametric Statistics, 2016, 28 (3), pp. 487-514. ⟨10.1080/10485252.2016.1190359⟩. ⟨hal-01882461⟩
2122 views
651 downloads
