Conference poster, Year: 2019

Deep neural network-based CHARME models with infinite memory

Abstract

We consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts), a class of generalized mixtures of nonlinear, nonparametric AR-ARCH time series. Under certain Lipschitz-type conditions on the autoregressive and volatility functions, we prove that this model is $\tau$-weakly dependent in the sense of Dedecker & Prieur (2004), and therefore ergodic and stationary. This result forms the theoretical basis for deriving an asymptotic theory for the underlying nonparametric estimation. As an application, in the case of a single expert, we use the universal approximation property of neural networks to develop an estimation theory for the autoregressive function based on deep neural networks, for which the consistency of the estimators of the network weights and biases is guaranteed.
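For context, a CHARME-type model with $K$ experts and infinite memory can be sketched in the following generic form (a standard formulation written here as an assumption; the poster's exact conditions may differ):

$$X_t = \sum_{k=1}^{K} \xi_t^{(k)} \Big( f_k(X_{t-1}, X_{t-2}, \dots) + g_k(X_{t-1}, X_{t-2}, \dots)\, \varepsilon_t \Big),$$

where $(\xi_t^{(1)},\dots,\xi_t^{(K)})$ is a hidden regime-selection process, the $f_k$ are the autoregressive functions, the $g_k$ are the volatility functions, and $(\varepsilon_t)$ is an i.i.d. noise sequence. In the single-expert case ($K=1$), the autoregressive function is approximated by a deep feed-forward network acting on a (truncated) history of the series.

As a purely illustrative sketch (not the authors' code; the toy data-generating process, the lag truncation p, and the network sizes below are assumptions), one can fit such a feed-forward approximation of the autoregressive function in Python:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Simulate a toy single-expert series X_t = f(X_{t-1}) + sigma * eps_t
n = 2000
X = np.zeros(n)
for t in range(1, n):
    X[t] = 0.8 * np.tanh(X[t - 1]) + 0.3 * rng.standard_normal()

# Lagged design matrix with p lags: a finite truncation of the infinite memory
p = 5
Z = np.column_stack([X[p - 1 - j : n - 1 - j] for j in range(p)])  # rows: (X_{t-1}, ..., X_{t-p})
y = X[p:]

# Feed-forward (two hidden layers) approximation of the autoregressive function f
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(Z, y)
print("In-sample R^2 of the fitted autoregressive function:", net.score(Z, y))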
Main file
conference_poster_DS3.pdf (213.18 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02518028, version 1 (24-03-2020)

Identifiers

  • HAL Id: hal-02518028, version 1

Cite

José G. Gómez-García, Jalal Fadili, Christophe Chesneau. Deep neural network-based CHARME models with infinite memory. Data Science Summer School (DS3), Jun 2019, Paris-Saclay, France. ⟨hal-02518028⟩
86 Views
37 Downloads
