Beyond calibration: estimating the grouping loss of modern neural networks - Archive ouverte HAL
Conference paper. Year: 2023

Beyond calibration: estimating the grouping loss of modern neural networks

Abstract

Ensuring that a classifier gives reliable confidence scores is essential for informed decision-making. To this end, recent work has focused on miscalibration, i.e., the over- or under-confidence of model scores. Yet calibration is not enough: even a perfectly calibrated classifier with the best possible accuracy can produce confidence scores that are far from the true posterior probabilities. This is due to the grouping loss, created by samples with the same confidence score but different true posterior probabilities. Proper scoring rule theory shows that, given the calibration loss, the missing piece to characterize individual errors is the grouping loss. While many estimators of the calibration loss exist, none exists for the grouping loss in standard settings. Here, we propose an estimator to approximate the grouping loss. We show that modern neural network architectures in vision and NLP exhibit grouping loss, notably in distribution-shift settings, which highlights the importance of pre-production validation.
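To make the decomposition the abstract alludes to concrete, here is a minimal, self-contained sketch (not the paper's estimator) of the classical proper-scoring-rule decomposition for the Brier score: total loss = calibration loss + grouping loss + irreducible loss, where the calibrated score is E[Y | S] and the true posterior is Q = E[Y | X]. The two-group setup, variable names, and numbers are hypothetical, chosen so that the classifier is perfectly calibrated yet still incurs grouping loss.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical toy setup: two latent groups with different true
# posterior probabilities Q, both mapped to the same confidence score S.
group = rng.integers(0, 2, size=n)       # observable feature splitting the level set
q = np.where(group == 0, 0.6, 0.9)       # true posterior Q = P(Y=1 | X)
y = rng.random(n) < q                    # labels drawn from Q
s = np.full(n, 0.75)                     # classifier reports a single score

# Calibrated score C = E[Y | S]; with one score level, it is the label mean.
c = np.full(n, y.mean())

# Brier-score decomposition: total ~= calibration + grouping + irreducible
# (holds exactly in expectation; approximately on a finite sample).
total = np.mean((s - y) ** 2)
calibration = np.mean((s - c) ** 2)      # ~0: s = 0.75 matches E[Y] = 0.75
grouping = np.mean((c - q) ** 2)         # ~0.0225 = 0.15**2, invisible to calibration
irreducible = np.mean((q - y) ** 2)      # label noise floor, ~0.165

print(f"total                 = {total:.4f}")
print(f"cal + group + irreduc = {calibration + grouping + irreducible:.4f}")
print(f"calibration           = {calibration:.4f}")
print(f"grouping              = {grouping:.4f}")
```

On this toy example the calibration term is near zero while the grouping term is about 0.0225: a calibration metric alone would declare the scores perfect, even though they miss each group's true posterior (0.6 and 0.9) by 0.15.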
Main file

beyond_calibration.pdf (5.89 MB)
beyond_calibration.bst (26.34 KB)

Origin: Files produced by the author(s)

Dates and versions

hal-03829870 , version 1 (27-10-2022)
hal-03829870 , version 2 (01-12-2022)
hal-03829870 , version 3 (07-12-2022)
hal-03829870 , version 4 (26-04-2023)

License

Attribution

Identifiers

hal-03829870

Cite

Alexandre Perez-Lebel, Marine Le Morvan, Gaël Varoquaux. Beyond calibration: estimating the grouping loss of modern neural networks. ICLR 2023 – The Eleventh International Conference on Learning Representations, May 2023, Kigali, Rwanda. ⟨hal-03829870v4⟩
301 Views
108 Downloads

