Generative modeling of extremes with neural networks
Abstract
We investigate new parametrizations based on neural networks in order to approximate and sample multivariate extreme values, especially in the case of heavy-tailed distributions. We discuss two approaches. First, transformations of feedforward neural networks based on Rectified Linear Units (ReLU) are used. An analysis of the uniform error between the extreme quantile and its GAN approximation is provided and shows that second-order parameters of the marginal data distributions play an important role. Second, ELU-based neural networks are used to efficiently remove the bias term in the tail approximation, even in the presence of arbitrary high-order parameters. These results are illustrated on synthetic and real data.
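Since the abstract only describes the first approach at a high level, the sketch below illustrates the general idea of inducing heavy tails by transforming the bounded output of a ReLU feedforward generator. It is a minimal, illustrative example only: the class and function names (ReLUGenerator, heavy_tail_transform), the sigmoid output layer, the inverse-Pareto push-forward and the tail-index parameter gamma are assumptions made here for demonstration, not the parametrization actually studied in this work.

```python
import torch
import torch.nn as nn

class ReLUGenerator(nn.Module):
    """Small feedforward ReLU generator mapping latent noise to (0, 1)^d."""
    def __init__(self, latent_dim=8, hidden_dim=64, data_dim=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, data_dim),
            nn.Sigmoid(),  # bounded output, interpreted as a pseudo-uniform vector
        )

    def forward(self, z):
        return self.net(z)

def heavy_tail_transform(u, gamma=0.5, eps=1e-6):
    """Inverse-Pareto push-forward: u in (0,1) -> (1 - u)^(-gamma).

    The marginals of the transformed samples have a Pareto-type tail
    with (assumed) tail index gamma.
    """
    return (1.0 - u.clamp(max=1.0 - eps)) ** (-gamma)

# Usage: draw heavy-tailed samples from the (here untrained) generator.
gen = ReLUGenerator()
z = torch.randn(1024, 8)
samples = heavy_tail_transform(gen(z), gamma=0.5)
```

In a GAN setting, the generator would be trained adversarially on the data before applying the tail transform; the sketch only shows how a bounded ReLU network output can be pushed into a heavy-tailed regime.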