Conference paper · Year: 2023

Preventing Dimensional Collapse in Contrastive Local Learning with Subsampling

Abstract

This paper investigates the challenges of training Deep Neural Networks (DNNs) with self-supervised objectives, using local learning as a parallelizable alternative to traditional backpropagation. In our approach, DNNs are segmented into distinct blocks, each updated independently using gradients provided by small local auxiliary Neural Networks (NNs). Despite the evident computational benefits, splitting into many blocks often degrades performance. Through analysis of a synthetic example, we identify layer-wise dimensional collapse as a major factor behind these performance losses. To counter this, we propose a novel and straightforward sampling strategy based on blockwise feature similarity, explicitly designed to avoid such dimensional collapse.
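To make the setting concrete, below is a minimal, hypothetical PyTorch sketch of blockwise local learning with a contrastive (InfoNCE-style) auxiliary loss and a feature-similarity-based subsampling step. The names (`LocalBlock`, `subsample_dissimilar`) and the greedy selection rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: local learning with contrastive auxiliary heads and
# feature-similarity subsampling. Not the paper's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalBlock(nn.Module):
    """One DNN block trained with its own small auxiliary head."""
    def __init__(self, in_dim, out_dim, proj_dim=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.head = nn.Linear(out_dim, proj_dim)  # small local auxiliary NN

    def forward(self, x):
        return self.body(x)

def info_nce(z1, z2, tau=0.5):
    """Standard InfoNCE between two augmented views (positives on the diagonal)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

def subsample_dissimilar(h, k):
    """Assumed similarity-based subsampling: greedily keep k samples whose
    block features are mutually dissimilar, discouraging the block from
    mapping all inputs onto a low-dimensional subspace."""
    sim = F.normalize(h, dim=1) @ F.normalize(h, dim=1).t()
    keep = [0]
    for _ in range(k - 1):
        # pick the sample least similar (on average) to those already kept
        scores = sim[:, keep].mean(dim=1)
        scores[keep] = float('inf')
        keep.append(scores.argmin().item())
    return torch.tensor(keep, device=h.device)

# Training step: each block is updated from its local loss only; gradients
# are stopped between blocks (detach), so blocks can be trained in parallel.
blocks = [LocalBlock(32, 32), LocalBlock(32, 32)]
opts = [torch.optim.SGD(b.parameters(), lr=1e-2) for b in blocks]

x1, x2 = torch.randn(128, 32), torch.randn(128, 32)  # two augmented views
for block, opt in zip(blocks, opts):
    h1, h2 = block(x1), block(x2)
    idx = subsample_dissimilar(h1.detach(), k=64)  # feature-based subsampling
    loss = info_nce(block.head(h1[idx]), block.head(h2[idx]))
    opt.zero_grad()
    loss.backward()
    opt.step()
    x1, x2 = h1.detach(), h2.detach()  # stop gradients to the next block
```

In this sketch the detach between blocks is what makes each update local and parallelizable, while the subsampling step stands in for the proposed collapse-avoidance mechanism: feeding the contrastive loss batches whose blockwise features are mutually dissimilar.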
Main file: main.pdf (1.32 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04156218, version 1 (20-07-2023)

Identifiers

  • HAL Id: hal-04156218, version 1

Cite

Louis Fournier, Adeetya Patel, Michael Eickenberg, Edouard Oyallon, Eugene Belilovsky. Preventing Dimensional Collapse in Contrastive Local Learning with Subsampling. ICML 2023 Workshop on Localized Learning (LLW), Jul 2023, Honolulu, Hawaii, United States. ⟨hal-04156218⟩