Conference paper, Year: 2020

Transfer Learning by Weighting Convolution

Stéphane Ayache
Ronan Sicre
Thierry Artières

Abstract

Transferring pretrained deep architectures to datasets with few labels remains a challenge in many real-world situations. This paper presents a new framework for understanding convolutional neural networks by establishing connections between Kronecker factorization and convolutional layers. We then introduce Convolution Weighting Layers, which learn a vector of weights for each channel, allowing efficient transfer learning in small training settings as well as enabling pruning of the transferred models. Experiments are conducted in two main settings with few labeled data: transfer learning for classification and transfer learning for retrieval. Two well-known convolutional architectures are evaluated on five public datasets. We show that weighting convolutions is effective for adapting pretrained models to new tasks and that the pruned networks retain good performance.
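The paper's exact formulation is not reproduced on this page, but the idea described in the abstract (learning one weight per channel on top of a frozen pretrained convolution) can be sketched roughly as follows. This is a minimal PyTorch illustration, assuming a simple element-wise channel scaling; the class name ConvWeighting and its interface are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class ConvWeighting(nn.Module):
    """Hypothetical sketch of a per-channel weighting layer.

    Scales each channel of a convolutional feature map by a
    learnable weight, so only this small vector is trained while
    the pretrained backbone stays frozen during transfer.
    """
    def __init__(self, num_channels: int):
        super().__init__()
        # One weight per channel, initialized to 1 (identity scaling).
        self.w = nn.Parameter(torch.ones(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, height, width).
        return x * self.w.view(1, -1, 1, 1)

# Usage sketch: insert after a frozen pretrained convolution.
conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
conv.requires_grad_(False)  # freeze the pretrained weights
layer = nn.Sequential(conv, ConvWeighting(64))
out = layer(torch.randn(2, 3, 32, 32))
print(out.shape)  # torch.Size([2, 64, 32, 32])
```

In such a setup only the per-channel vector is trained, which matches the abstract's few-labels motivation; channels whose learned weight ends up near zero would also be natural candidates for the pruning the authors mention.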
Main file

wconv.pdf (838.86 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02544099, version 1 (16-04-2020)

Identifiers

  • HAL Id: hal-02544099, version 1

Cite

Stéphane Ayache, Ronan Sicre, Thierry Artières. Transfer Learning by Weighting Convolution. International Joint Conference on Neural Networks (IJCNN), 2020, Glasgow, United Kingdom. ⟨hal-02544099⟩
