Preprint / Working Paper. Year: 2019

Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models

Abstract

Mixtures-of-Experts (MoE) are conditional mixture models that have proven effective for modeling heterogeneity in data across many statistical learning tasks, including regression, classification, and clustering. Their estimation in high-dimensional problems, however, remains challenging. We consider the problem of parameter estimation and feature selection in MoE models with different generalized linear expert models, and propose a regularized maximum likelihood estimation that efficiently encourages sparse solutions for heterogeneous data with high-dimensional predictors. The developed proximal-Newton EM algorithm includes proximal Newton-type procedures that update the model parameters by monotonically maximizing the objective function, allowing efficient estimation and feature selection. An experimental study shows the good performance of the algorithms in recovering the actual sparse solutions, in parameter estimation, and in clustering heterogeneous regression data, compared with the main state-of-the-art competitors.
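The abstract describes the regularized objective only in words; as a rough sketch (the exact penalty form is an assumption here, not stated above), a Lasso-regularized MoE log-likelihood typically takes the form

\mathcal{PL}(\theta) \;=\; \mathcal{L}(\theta) \;-\; \sum_{k=1}^{K} \lambda_k \lVert \beta_k \rVert_1 \;-\; \sum_{k=1}^{K-1} \gamma_k \lVert w_k \rVert_1 ,

where \mathcal{L}(\theta) is the observed-data log-likelihood of the MoE model, \beta_k are the coefficient vectors of the generalized linear experts, w_k the gating-network coefficients, and \lambda_k, \gamma_k \ge 0 tuning parameters controlling sparsity. Under this reading, the proximal Newton-type steps within the EM M-step maximize a local quadratic approximation of the smooth surrogate plus the nonsmooth \ell_1 terms, which is what makes the updates both monotone and sparsity-inducing.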
File not deposited

Dates and versions

hal-03584734, version 1 (22-02-2022)

Identifiers

  • HAL Id: hal-03584734, version 1

Cite

Bao-Tuyen Huynh, Faicel Chamroukhi. Estimation and Feature Selection in Mixtures of Generalized Linear Experts Models. 2019. ⟨hal-03584734⟩