Conference paper, Year: 2023

Domain-Aware Augmentations for Unsupervised Online General Continual Learning

Abstract

Continual learning is challenging, especially in unsupervised scenarios such as Unsupervised Online General Continual Learning (UOGCL), where the learning agent has no prior knowledge of class boundaries or task changes. While previous research has focused on reducing forgetting in supervised setups, recent studies have shown that self-supervised learners are more resilient to forgetting. This paper proposes a novel approach that enhances memory usage for contrastive learning in UOGCL by defining and applying stream-dependent data augmentations, together with several implementation tricks. Our method is simple yet effective: it achieves state-of-the-art results among unsupervised approaches in all considered setups and narrows the gap between supervised and unsupervised continual learning. The proposed domain-aware augmentation procedure can be adapted to other replay-based methods, making it a promising strategy for continual learning.
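The exact domain-aware augmentation procedure is detailed in the paper itself; as a rough illustration only, the PyTorch sketch below shows what the general recipe described in the abstract (a replay memory, a contrastive loss, and augmentations whose strength depends on the incoming stream) might look like. The ReservoirBuffer, the color-variance heuristic in stream_dependent_augment, and all hyperparameters are illustrative assumptions, not the authors' method.

```python
import random

import torch
import torch.nn.functional as F
from torchvision import transforms


class ReservoirBuffer:
    """Small reservoir-sampling memory, a common choice for online replay
    (assumed here; the paper may use a different memory scheme)."""

    def __init__(self, capacity=500):
        self.capacity, self.data, self.seen = capacity, [], 0

    def add(self, batch):
        for x in batch:
            self.seen += 1
            if len(self.data) < self.capacity:
                self.data.append(x)
            else:
                j = random.randrange(self.seen)
                if j < self.capacity:
                    self.data[j] = x

    def sample(self, k):
        idx = random.sample(range(len(self.data)), min(k, len(self.data)))
        return torch.stack([self.data[i] for i in idx])

    def __len__(self):
        return len(self.data)


def nt_xent(z1, z2, temperature=0.5):
    """Standard NT-Xent (SimCLR) contrastive loss between two views."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    n = z1.size(0)
    # Exclude self-similarities; each sample's positive is its other view.
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device), float("-inf"))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)


def stream_dependent_augment(x, stream_stats):
    """Hypothetical stream-dependent policy: scale color jitter by a crude
    estimate of the incoming domain's color variance."""
    s = 0.8 if stream_stats["color_var"] > 0.05 else 0.3
    aug = transforms.Compose([
        transforms.RandomResizedCrop(32, scale=(0.5, 1.0), antialias=True),
        transforms.RandomHorizontalFlip(),
        transforms.ColorJitter(0.4 * s, 0.4 * s, 0.4 * s),
    ])
    return torch.stack([aug(img) for img in x])


def train_step(model, optimizer, stream_batch, buffer):
    # Mix the incoming stream batch with replayed samples from memory.
    x = stream_batch
    if len(buffer) > 0:
        x = torch.cat([stream_batch, buffer.sample(len(stream_batch))])
    stats = {"color_var": stream_batch.flatten(1).var().item()}  # toy statistic
    v1 = stream_dependent_augment(x, stats)
    v2 = stream_dependent_augment(x, stats)
    loss = nt_xent(model(v1), model(v2))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    buffer.add(stream_batch)  # memory is updated without any label information
    return loss.item()
```

Note that nothing in this sketch uses labels or task boundaries, which is the defining constraint of the UOGCL setting described above.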
Main file
0452.pdf (670.23 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-04425453, version 1 (30-01-2024)

Identifiers

HAL Id: hal-04425453
DOI: 10.48550/arXiv.2309.06896

Cite

Nicolas Michel, Romain Negrel, Giovanni Chierchia, Jean-François Bercher. Domain-Aware Augmentations for Unsupervised Online General Continual Learning. The 34th British Machine Vision Conference, Nov 2023, Aberdeen, United Kingdom. ⟨10.48550/arXiv.2309.06896⟩. ⟨hal-04425453⟩