Phase Collapse in Neural Networks - Archive ouverte HAL
Conference Paper · Year: 2022

Phase Collapse in Neural Networks

Florentin Guth
John Zarka

Abstract

Deep convolutional classifiers linearly separate image classes and improve in accuracy as depth increases. They progressively reduce the spatial dimension while the number of channels grows with depth, so spatial variability is transformed into variability along channels. A fundamental challenge is to understand the role of the non-linearities, together with the convolutional filters, in this transformation. ReLUs with biases are often interpreted as thresholding operators that improve discrimination through sparsity. This paper demonstrates that a different mechanism, called phase collapse, eliminates spatial variability while linearly separating classes. We show that collapsing the phases of complex wavelet coefficients is sufficient to reach the classification accuracy of ResNets of similar depth. Replacing the phase collapses with thresholding operators that enforce sparsity, however, considerably degrades performance. We explain these numerical results by showing that iterating phase collapses progressively improves class separation, whereas iterating thresholding non-linearities does not.
Main file: ICLR2022.pdf (1.37 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03373703 , version 1 (13-10-2021)
hal-03373703 , version 2 (24-03-2022)

Cite

Florentin Guth, John Zarka, Stephane Mallat. Phase Collapse in Neural Networks. International Conference on Learning Representations (ICLR 2022), Apr 2022, Online, France. ⟨hal-03373703v2⟩