A soft nearest-neighbor framework for continual semi-supervised learning - Archive ouverte HAL
Preprint, Working Paper. Year: 2022

A soft nearest-neighbor framework for continual semi-supervised learning

Abstract

Despite significant advances, the performance of state-of-the-art continual learning approaches hinges on the unrealistic scenario of fully labeled data. In this paper, we tackle this challenge and propose an approach for continual semi-supervised learning, a setting where not all the data samples are labeled. An underlying issue in this scenario is that the model forgets representations of unlabeled data and overfits the labeled ones. We leverage the power of nearest-neighbor classifiers to non-linearly partition the feature space and learn a strong representation for the current task, as well as to distill relevant information from previous tasks. We perform a thorough experimental evaluation and show that our method outperforms all existing approaches by large margins, setting a strong state of the art for the continual semi-supervised learning paradigm. For example, on CIFAR100 we surpass several other methods even while using at least 30 times less supervision (0.8% vs. 25% of annotations). The code is publicly available at https://github.com/kangzhiq/NNCSL
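For intuition, the sketch below shows a generic soft nearest-neighbor classifier of the kind the abstract refers to: a query feature is classified by a temperature-scaled softmax over its similarities to a support set of labeled features, which yields soft class probabilities. This is a minimal illustration under assumed names (`soft_nn_probs`, `temperature`) and assumed PyTorch usage, not the authors' exact NNCSL implementation.

```python
# Minimal sketch of a soft nearest-neighbor classifier (illustrative only;
# not the authors' exact NNCSL code). A query feature is assigned soft class
# probabilities via a softmax over cosine similarities to labeled support features.
import torch
import torch.nn.functional as F

def soft_nn_probs(query, support, support_labels, num_classes, temperature=0.1):
    """query: (B, D) features; support: (N, D) labeled features;
    support_labels: (N,) integer class ids. Returns (B, num_classes) probabilities."""
    q = F.normalize(query, dim=1)
    s = F.normalize(support, dim=1)
    sim = q @ s.t() / temperature                  # (B, N) scaled cosine similarities
    weights = F.softmax(sim, dim=1)                # soft neighbor weights over the support set
    one_hot = F.one_hot(support_labels, num_classes).float()  # (N, C)
    return weights @ one_hot                       # (B, C) soft class probabilities

# Example usage with assumed tensors:
# probs = soft_nn_probs(feats, memory_feats, memory_labels, num_classes=100)
```

Because the class assignment is a differentiable, softmax-weighted vote over labeled neighbors, such a classifier can shape the feature space non-linearly and also provide soft targets that can be distilled from previous tasks.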
Main file
Paper_for_arXiv.pdf (842.33 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03893056, version 1 (10-12-2022)
hal-03893056, version 2 (05-04-2023)
hal-03893056, version 3 (11-09-2023)

Identifiers

  • HAL Id: hal-03893056, version 1

Cite

Zhiqi Kang, Enrico Fini, Moin Nabi, Elisa Ricci, Karteek Alahari. A soft nearest-neighbor framework for continual semi-supervised learning. 2022. ⟨hal-03893056v1⟩
141 views
65 downloads
