Conference paper, 2024

Beyond the Norms: Detecting Prediction Errors in Regression Models

Abstract

This paper tackles the challenge of detecting unreliable behavior in regression algorithms, which may arise from intrinsic variability (e.g., aleatoric uncertainty) or modeling errors (e.g., model uncertainty). First, we formally introduce the notion of unreliability in regression, i.e., when the discrepancy (or error) of the regressor's output exceeds a specified threshold. Then, using powerful tools for probabilistic modeling, we estimate the discrepancy density and measure its statistical diversity with our proposed metric for statistical dissimilarity. This, in turn, allows us to derive a data-driven score that expresses the uncertainty of the regression outcome. We show empirical improvements in error detection for multiple regression tasks, consistently outperforming popular baseline approaches, contributing to the broader field of uncertainty quantification and safe machine learning systems.
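The abstract outlines a three-step pipeline: define unreliability via a discrepancy threshold, estimate the discrepancy density, and derive a data-driven uncertainty score. The sketch below is a simplified stand-in for that pipeline, not the paper's method: it assumes a zero-mean Gaussian conditional discrepancy density whose scale is predicted from the input, and scores each test point by the probability that its discrepancy exceeds a tolerance epsilon. The synthetic data, the choice of models, and the epsilon value are all illustrative assumptions.

```python
# Minimal, hedged sketch of discrepancy-based error detection; NOT the
# authors' exact estimator or dissimilarity metric. Assumptions: a
# zero-mean Gaussian discrepancy density with input-dependent scale,
# an arbitrary tolerance epsilon, and synthetic data.
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic heteroscedastic regression task (illustrative only).
X = rng.uniform(-3.0, 3.0, size=(6000, 1))
y = np.sin(2.0 * X[:, 0]) + rng.normal(scale=0.1 + 0.3 * np.abs(X[:, 0]))

X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_te, y_cal, y_te = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Base regressor whose unreliable predictions we want to detect.
reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Crude conditional density estimate on a held-out calibration split:
# a second model predicts the residual magnitude, used as a proxy for
# the scale sigma(x) of the discrepancy density.
resid_cal = y_cal - reg.predict(X_cal)
scale_model = RandomForestRegressor(n_estimators=200, random_state=0)
scale_model.fit(X_cal, np.abs(resid_cal))

# Data-driven uncertainty score: P(|error| > epsilon | x) under the
# assumed N(0, sigma(x)^2) discrepancy density.
epsilon = 0.5  # assumed tolerance; application dependent
sigma_te = np.maximum(scale_model.predict(X_te), 1e-6)
score = 2.0 * norm.sf(epsilon / sigma_te)

# Ground truth for evaluation: did the true discrepancy exceed epsilon?
unreliable = (np.abs(y_te - reg.predict(X_te)) > epsilon).astype(int)
print(f"error-detection AUROC: {roc_auc_score(unreliable, score):.3f}")
```

Any stand-in for the paper's density model (kernel density estimates, mixtures, normalizing flows) can be slotted into the scale/density step; only the ranking induced by the score matters for the AUROC-style detection evaluation used here.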
Main file: ICML2024_Safe_Regression-6.pdf (1.05 MB)
Origin: files produced by the author(s)

Dates and versions

hal-04575936, version 1 (31-05-2024)

Identifiers

  • HAL Id: hal-04575936, version 1

Cite

Andres Altieri, Marco Romanelli, Georg Pichler, Florence Alberge, Pablo Piantanida. Beyond the Norms: Detecting Prediction Errors in Regression Models. Forty-first International Conference on Machine Learning (ICML 2024), Jul 2024, Vienna, Austria. ⟨hal-04575936⟩
