EV-GAN: Simulation of extreme events with ReLU neural networks
Journal article in Journal of Machine Learning Research, 2022


Abstract

Feedforward neural networks based on rectified linear units (ReLU) cannot efficiently approximate quantile functions that are not bounded, especially in the case of heavy-tailed distributions. We thus propose a new parametrization for the generator of a generative adversarial network (GAN) adapted to this framework, building on extreme-value theory. An analysis of the uniform error between the extreme quantile and its GAN approximation is provided: we establish that the rate of convergence of the error is mainly driven by the second-order parameter of the data distribution. These results are illustrated on simulated data and real financial data. It appears that our approach outperforms the classical GAN in a wide range of situations, including high-dimensional and dependent data.
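To illustrate the kind of reparametrization the abstract refers to, here is a minimal, hypothetical PyTorch sketch: the ReLU network only models a bounded correction on a log scale, while an explicit Pareto-type factor (1-u)^(-gamma) carries the heavy tail. The class name TailReparamGenerator, the tail index value gamma, and the network sizes are illustrative assumptions, not the authors' exact EV-GAN architecture.

```python
# A minimal sketch (not the paper's exact parametrization): instead of asking a
# ReLU network to output a heavy-tailed quantile q(u) directly, which diverges
# as u -> 1, the network models a bounded correction on the log scale and the
# heavy tail is handled analytically by a Pareto-type factor (1 - u)^(-gamma).
import torch
import torch.nn as nn

class TailReparamGenerator(nn.Module):
    def __init__(self, gamma: float = 0.5, hidden: int = 64):
        super().__init__()
        self.gamma = gamma  # illustrative tail index, assumed known here
        # Small ReLU network modelling a bounded correction term.
        self.correction = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u in (0, 1): uniform noise fed to the generator.
        # log q(u) = -gamma * log(1 - u) + bounded network correction.
        log_q = -self.gamma * torch.log1p(-u) + self.correction(u)
        return torch.exp(log_q)

# Usage: draw heavy-tailed pseudo-samples from the (untrained) generator.
u = torch.rand(5, 1)
samples = TailReparamGenerator()(u)
```

A plain ReLU generator trained to output q(u) directly would have to extrapolate an unbounded function near u = 1, which is exactly the failure mode the abstract points out; the sketch above moves that unbounded part out of the network.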
Main file
EV_GAN-HAL-v3.pdf (2.12 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03250663, version 1 (04-06-2021)
hal-03250663, version 2 (16-06-2021)
hal-03250663, version 3 (22-03-2022)

Identifiers

  • HAL Id: hal-03250663, version 3

Cite

Michaël Allouche, Stéphane Girard, Emmanuel Gobet. EV-GAN: Simulation of extreme events with ReLU neural networks. Journal of Machine Learning Research, 2022, 23 (150), pp. 1-39. ⟨hal-03250663v3⟩
1201 views
719 downloads
