Journal article, Journal of Statistical Distributions and Applications, Year: 2021

Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models

Abstract

Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces, when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the soft-max gating function class and its relationship to the class of Gaussian gating functions.
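For context, a minimal sketch of the soft-max gated MoE conditional density that such denseness results typically concern; Gaussian experts are assumed here purely for illustration, and the exact expert form studied in the paper may differ:

% Illustrative soft-max gated mixture of experts conditional density.
% Gaussian experts phi(.; mu, Sigma) are an assumption, not taken from the abstract.
\[
  f(y \mid x) \;=\; \sum_{k=1}^{K} g_k(x)\,
    \phi\!\left(y;\, \mu_k(x),\, \Sigma_k\right),
  \qquad
  g_k(x) \;=\;
  \frac{\exp\!\left(a_k + b_k^{\top} x\right)}
       {\sum_{l=1}^{K} \exp\!\left(a_l + b_l^{\top} x\right)},
\]
where $\phi(\cdot;\mu,\Sigma)$ denotes a Gaussian density with mean $\mu$ and covariance $\Sigma$, and the soft-max gates $g_k$ are driven by affine functions of the input $x$.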

Dates and versions

hal-03328115, version 1 (28-08-2021)

Identifiers

Cite

Hien Duy Nguyen, TrungTin Nguyen, Faicel Chamroukhi, Geoffrey John McLachlan. Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models. Journal of Statistical Distributions and Applications, 2021, 8 (13), ⟨10.1186/s40488-021-00125-0⟩. ⟨hal-03328115⟩
