Conference paper, 2017

Paying more attention to attention: improving the performance of convolutional neural networks via attention transfer

Nikos Komodakis
  • Role: Author
  • PersonId : 945099
Sergey Zagoruyko
  • Role: Author
  • PersonId : 1004623

Abstract

Attention plays a critical role in human visual experience. It has also recently been shown that attention can play an important role when artificial neural networks are applied to a variety of tasks in fields such as computer vision and NLP. In this work we show that, by properly defining attention for convolutional neural networks, we can use this type of information to significantly improve the performance of a student CNN by forcing it to mimic the attention maps of a powerful teacher network. To that end, we propose several novel methods of transferring attention, showing consistent improvement across a variety of datasets and convolutional neural network architectures. Code and models for our experiments are available at https://github.com/szagoruyko/attention-transfer
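The central idea, a student network trained to match the spatial attention maps of a teacher, can be sketched in a few lines of PyTorch. The sketch below is illustrative only, assuming activation-based attention maps (channel-wise mean of squared activations, L2-normalized); the helper names attention_map and attention_transfer_loss and the toy tensor shapes are assumptions for this example, not taken verbatim from the released repository.

import torch
import torch.nn.functional as F

def attention_map(feature_map):
    # Activation-based spatial attention: mean of squared activations over
    # channels, flattened and L2-normalized per example (shape N x H*W).
    am = feature_map.pow(2).mean(dim=1)
    return F.normalize(am.flatten(start_dim=1), dim=1)

def attention_transfer_loss(student_fm, teacher_fm):
    # Mean squared distance between the normalized attention maps of one
    # student layer and the corresponding teacher layer; assumes the two
    # feature maps share the same spatial resolution.
    return (attention_map(student_fm) - attention_map(teacher_fm)).pow(2).mean()

# Toy usage: teacher and student differ in channel count, but their attention
# maps live in the same N x (H*W) space and are therefore directly comparable.
student_features = torch.randn(4, 64, 16, 16)
teacher_features = torch.randn(4, 256, 16, 16)
print(attention_transfer_loss(student_features, teacher_features).item())

In training, a term of this kind would typically be added, at several depths of the network, to the usual classification loss on the student's outputs.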

Dates and versions

hal-01832769, version 1 (09-07-2018)

Identifiers

Cite

Nikos Komodakis, Sergey Zagoruyko. Paying more attention to attention: improving the performance of convolutional neural networks via attention transfer. ICLR, Apr 2017, Toulon, France. ⟨hal-01832769⟩