Conference paper, Year: 2021

SALAD: Self-Assessment Learning for Action Detection

Abstract

The literature on self-assessment in machine learning focuses mainly on producing well-calibrated algorithms through consensus frameworks, i.e., calibration is treated as a problem to be solved. Yet we observe that learning to be properly confident can act as a powerful regularizer and thus as an opportunity to improve performance. Specifically, we show that, when used within an action detection framework, learning a self-assessment score improves the whole action localization process. Experimental results show that our approach outperforms the state of the art on two action detection benchmarks. On the THUMOS14 dataset, the mAP at tIoU 0.5 is improved from 42.8% to 44.6%, and on the ActivityNet1.3 dataset from 50.4% to 51.7%. For lower tIoU thresholds, we achieve even larger improvements on both datasets.
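To make the idea concrete, below is a minimal sketch of how a self-assessment score could be learned jointly with an action detector. This is an illustrative assumption, not the authors' exact SALAD architecture or loss: it adds an auxiliary head that predicts the tIoU its own segment predictions will achieve, so that confidence estimation regularizes the shared features. The module names (DetectorWithSelfAssessment, assess_head) and the weighting factor lam are hypothetical.

```python
# Illustrative sketch (assumption, not the authors' exact method): an action detector
# with an auxiliary self-assessment head regressing the tIoU of its own segments,
# trained jointly with the detection losses.
import torch
import torch.nn as nn


def temporal_iou(pred, gt):
    """tIoU between predicted and ground-truth segments, shape (N, 2) as (start, end)."""
    inter = (torch.min(pred[:, 1], gt[:, 1]) - torch.max(pred[:, 0], gt[:, 0])).clamp(min=0)
    union = (pred[:, 1] - pred[:, 0]) + (gt[:, 1] - gt[:, 0]) - inter
    return inter / union.clamp(min=1e-6)


class DetectorWithSelfAssessment(nn.Module):
    def __init__(self, feat_dim=256, num_classes=20):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(feat_dim, 256), nn.ReLU())
        self.cls_head = nn.Linear(256, num_classes)   # action classification
        self.reg_head = nn.Linear(256, 2)             # segment (start, end)
        self.assess_head = nn.Linear(256, 1)          # self-assessment score in [0, 1]

    def forward(self, x):
        h = self.backbone(x)
        score = torch.sigmoid(self.assess_head(h)).squeeze(-1)
        return self.cls_head(h), self.reg_head(h), score


def joint_loss(cls_logits, segments, assess, labels, gt_segments, lam=1.0):
    # Standard detection losses plus a self-assessment term: the auxiliary head
    # must predict the tIoU that its own segments achieve against the ground truth.
    det_loss = nn.functional.cross_entropy(cls_logits, labels) \
             + nn.functional.smooth_l1_loss(segments, gt_segments)
    with torch.no_grad():
        target = temporal_iou(segments, gt_segments)  # quality of current predictions
    return det_loss + lam * nn.functional.mse_loss(assess, target)
```

At inference, such a score could be used to rank or rescore candidate segments before non-maximum suppression, which is one plausible way a self-assessment signal improves localization.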
Main file
Vaudaux-Ruth_SALAD_Self-Assessment_Learning_for_Action_Detection_WACV_2021_paper.pdf (1.79 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03156960 , version 1 (10-11-2020)
hal-03156960 , version 2 (02-03-2021)

Identifiers

  • HAL Id: hal-03156960, version 2

Cite

Guillaume Vaudaux-Ruth, Adrien Chan-Hon-Tong, Catherine Achard. SALAD: Self-Assessment Learning for Action Detection. IEEE/CVF Winter Conference on Applications of Computer Vision, Jan 2021, virtual, United States. ⟨hal-03156960v2⟩
96 Views
133 Downloads
