Journal article in Computer Vision and Image Understanding, 2019

Exploring Weight Symmetry in Deep Neural Networks

Abstract

We propose to impose symmetry in neural network parameters to improve parameter usage and to make use of dedicated convolution and matrix multiplication routines. Due to the significant reduction in the number of parameters caused by the symmetry constraints, one would expect a dramatic drop in accuracy. Surprisingly, we show that this is not the case and that, depending on network size, symmetry can have little or no negative effect on network accuracy, especially in deep overparameterized networks. We propose several ways to impose local symmetry in recurrent and convolutional neural networks, and show that our symmetry parameterizations satisfy the universal approximation property for single-hidden-layer networks. We extensively evaluate these parameterizations on CIFAR, ImageNet and language modeling datasets, showing significant benefits from the use of symmetry. For instance, our ResNet-101 with channel-wise symmetry has almost 25% fewer parameters and only 0.2% accuracy loss on ImageNet. Code for our experiments is available at https://github.com/hushell/deep-symmetry
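To illustrate the general idea, the following is a minimal PyTorch sketch of one plausible symmetry parameterization: a fully connected layer whose weight matrix is tied to be symmetric (W = Wᵀ) by symmetrizing an underlying matrix on the forward pass. This is an assumption-laden illustration, not the paper's exact method; the authors propose several local symmetry variants (including channel-wise symmetry for convolutions), and the class name SymmetricLinear is hypothetical.

    import torch
    import torch.nn as nn

    class SymmetricLinear(nn.Module):
        """Fully connected layer with a symmetric weight matrix W = W^T.

        Illustrative sketch only, not the paper's exact parameterization.
        (W + W^T) / 2 has n*(n+1)/2 degrees of freedom instead of n^2;
        for simplicity this sketch still stores the full n x n matrix,
        whereas a memory-saving variant would store only the upper triangle.
        """

        def __init__(self, n_features: int, bias: bool = True):
            super().__init__()
            self.weight = nn.Parameter(torch.empty(n_features, n_features))
            nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
            if bias:
                self.bias = nn.Parameter(torch.zeros(n_features))
            else:
                self.register_parameter("bias", None)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Symmetrize on the fly: gradients for the (i, j) and (j, i)
            # entries are tied together by this reparameterization.
            w = 0.5 * (self.weight + self.weight.t())
            return nn.functional.linear(x, w, self.bias)

    # Usage: behaves like nn.Linear but with tied, symmetric weights.
    layer = SymmetricLinear(8)
    y = layer(torch.randn(4, 8))
    print(y.shape)  # torch.Size([4, 8])

Because the symmetrization is differentiable, the constraint is enforced throughout training rather than imposed after the fact, which is what makes the approach compatible with standard optimizers.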
Main file
S107731421930102X.pdf (493.03 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01978633, version 1 (20-07-2022)

License

Attribution - NonCommercial (CC BY-NC)

Cite

Xu Shell Hu, Sergey Zagoruyko, Nikos Komodakis. Exploring Weight Symmetry in Deep Neural Networks. Computer Vision and Image Understanding, 2019, 187, ⟨10.1016/j.cviu.2019.07.006⟩. ⟨hal-01978633⟩