Journal article in Neurocomputing, 2019

Approximation results regarding the multiple-output Gaussian gated mixture of linear experts model

Abstract

Mixture of experts (MoE) models are a class of artificial neural networks that can be used for functional approximation and probabilistic modeling. An important class of MoE models is the class of mixture of linear experts (MoLE) models, where the expert functions map to real topological output spaces. Recently, Gaussian gated MoLE models have become popular in applied research. A number of powerful approximation results exist for Gaussian gated MoLE models when the output space is univariate. These results guarantee the ability of Gaussian gated MoLE mean functions to approximate arbitrary continuous functions, and of Gaussian gated MoLE models themselves to approximate arbitrary conditional probability density functions. We utilize and extend the univariate approximation results in order to prove a pair of useful results for situations where the output spaces are multivariate. We do this by proving a pair of lemmas regarding the combination of univariate MoLE models, which are interesting in their own right.
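For concreteness, the sketch below illustrates the kind of model the abstract describes. It is not the authors' code: it assumes the common parameterization in which the gates are normalized Gaussian densities pi_k N(x; mu_k, Sigma_k), each expert is an affine map A_k x + b_k, the mean function is the gate-weighted sum of expert outputs, and the conditional density is the gate-weighted Gaussian mixture N(y; A_k x + b_k, Gamma_k). All function and parameter names (gates, mole_mean, mole_density, and so on) are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gates(x, pis, mus, Sigmas):
    """Gaussian gating weights: proportional to pi_k * N(x; mu_k, Sigma_k)."""
    w = np.array([pi * multivariate_normal.pdf(x, mean=mu, cov=S)
                  for pi, mu, S in zip(pis, mus, Sigmas)])
    return w / w.sum()

def mole_mean(x, pis, mus, Sigmas, As, bs):
    """Mean function: sum_k gate_k(x) * (A_k x + b_k); returns a q-vector."""
    g = gates(x, pis, mus, Sigmas)
    return sum(gk * (A @ x + b) for gk, A, b in zip(g, As, bs))

def mole_density(y, x, pis, mus, Sigmas, As, bs, Gammas):
    """Conditional density: sum_k gate_k(x) * N(y; A_k x + b_k, Gamma_k)."""
    g = gates(x, pis, mus, Sigmas)
    return sum(gk * multivariate_normal.pdf(y, mean=A @ x + b, cov=G)
               for gk, A, b, G in zip(g, As, bs, Gammas))

# Hypothetical usage: K = 3 experts, 2-dimensional input and output.
rng = np.random.default_rng(0)
d, q, K = 2, 2, 3
pis = np.full(K, 1.0 / K)
mus = [rng.normal(size=d) for _ in range(K)]
Sigmas = [np.eye(d)] * K
As = [rng.normal(size=(q, d)) for _ in range(K)]
bs = [rng.normal(size=q) for _ in range(K)]
Gammas = [np.eye(q)] * K

x = np.zeros(d)
print(mole_mean(x, pis, mus, Sigmas, As, bs))                          # q-vector
print(mole_density(np.zeros(q), x, pis, mus, Sigmas, As, bs, Gammas))  # scalar
```

With multivariate x and y, mole_mean evaluates the vector-valued mean function whose approximation capacity the paper studies, and mole_density evaluates the corresponding conditional density.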
Main file
20190513_Manuscript_R1.pdf (322.15 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02265793, version 1 (12-08-2019)

Identifiers

Cite

Hien Duy Nguyen, Faicel Chamroukhi, Florence Forbes. Approximation results regarding the multiple-output Gaussian gated mixture of linear experts model. Neurocomputing, 2019, 366, pp.208-214. ⟨10.1016/j.neucom.2019.08.014⟩. ⟨hal-02265793⟩
122 views
251 downloads

