Modeling and scoring dynamic probabilistic forecasts
Abstract
Probabilistic forecasting plays a major role in many applications where a forecast is needed together with an assessment of its uncertainty. Verification of probabilistic forecasts has become increasingly important and relies mostly on two sets of tools: scoring rules and calibration diagnostics. Proper scoring rules assign numerical scores to forecasts in such a way that the correct forecast achieves the minimal expected score. Calibration theory aims at verifying that observations and forecasts are consistent. In practice, using a probabilistic forecast commonly involves a sequential decision-making process in which the environment evolves over time. In this article, we propose a mathematical framework for dynamic probabilistic forecasting. Forecasts therein take the form of stochastic processes adapted to a filtration that encodes the available information. Under minimal assumptions, we show that proper scoring rules can still be used in this dynamic framework to discriminate the ideal forecast; more precisely, we prove that the long-term average score is close to its minimum if and only if the forecasts are close to the ideal forecast. Connections are also drawn in terms of the Wasserstein distance, and links are given between scoring rules and reproducing kernels.
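To make the propriety property concrete, here is a minimal sketch (not taken from the paper) illustrating it for the logarithmic score with Gaussian forecasts: the average score over many observations is smallest for the forecast distribution that matches the data-generating distribution. The candidate parameters and the use of `numpy`/`scipy` are illustrative choices, not the authors' setup.

```python
# Illustrative sketch: the logarithmic score S(f, y) = -log f(y) is a
# proper scoring rule, so its expected value under the true distribution
# is minimized by the true distribution itself.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(loc=0.0, scale=1.0, size=100_000)  # observations ~ N(0, 1)

# Candidate Gaussian forecasts: the ideal one and two misspecified ones.
candidates = {
    "ideal N(0, 1)": (0.0, 1.0),
    "biased N(0.5, 1)": (0.5, 1.0),
    "overdispersed N(0, 2)": (0.0, 2.0),
}

for name, (mu, sigma) in candidates.items():
    avg_score = -norm.logpdf(y, loc=mu, scale=sigma).mean()
    print(f"{name}: average log score = {avg_score:.4f}")

# The ideal forecast attains the smallest average score, close to 1.4189,
# the differential entropy of N(0, 1), consistent with propriety.
```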
Domains
Statistics [math.ST]