New metrics for analyzing continual learners - Archive ouverte HAL
Conference paper, Year: 2023

New metrics for analyzing continual learners

Abstract

Deep neural networks have shown remarkable performance when trained on independent and identically distributed data from a fixed set of classes. However, in real-world scenarios, it can be desirable to train models on a continuous stream of data in which multiple classification tasks are presented sequentially. This scenario, known as Continual Learning (CL), poses challenges to standard learning algorithms, which struggle to retain knowledge of old tasks while learning new ones. This stability-plasticity dilemma remains central to CL, and multiple metrics have been proposed to measure stability and plasticity separately. However, none of them accounts for the increasing difficulty of the classification task, which inherently results in performance loss for any model. In this context, we analyze some limitations of current metrics and identify the presence of setup-induced forgetting. We therefore propose new metrics that account for the task's increasing difficulty. Through experiments on benchmark datasets, we demonstrate that our proposed metrics can provide new insights into the stability-plasticity trade-off achieved by models in the continual learning environment.
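For context, the existing metrics the abstract refers to are commonly formalized as follows. This is a sketch of the standard average accuracy and average forgetting definitions from the continual learning literature, not the new metrics proposed in the paper. Letting $a_{t,i}$ denote the test accuracy on task $i$ after training on task $t$, and $T$ the total number of tasks:

\[
\mathrm{ACC} = \frac{1}{T}\sum_{i=1}^{T} a_{T,i},
\qquad
\mathrm{F} = \frac{1}{T-1}\sum_{i=1}^{T-1}\Big(\max_{t \in \{1,\dots,T-1\}} a_{t,i} - a_{T,i}\Big).
\]

Here $\mathrm{ACC}$ summarizes final performance and $\mathrm{F}$ (forgetting) is often used as a proxy for stability. The paper's argument is that such measures do not separate forgetting caused by the model from the performance drop induced by the setup itself, since the classification problem grows harder as tasks accumulate.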
Main file: 2309.00462.pdf (412.64 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04425466, version 1 (30-01-2024)

Identifiers

Cite

Nicolas Michel, Giovanni Chierchia, Romain Negrel, Jean-François Bercher, Toshihiko Yamasaki. New metrics for analyzing continual learners. MIRU2023 (Symposium on Image Recognition and Understanding), Jul 2023, Hamamatsu, Japan. ⟨10.48550/arXiv.2309.00462⟩. ⟨hal-04425466⟩