Conference paper, year: 2023

Shannon Strikes Again! Entropy-Based Pruning in Deep Neural Networks for Transfer Learning Under Extreme Memory and Computation Budgets

Abstract

Deep neural networks have become the de facto standard across various computer science domains. Nonetheless, effectively training these deep networks remains challenging and resource-intensive. This paper investigates the efficacy of pruned deep learning models in transfer learning scenarios under extremely low memory budgets, tailored for TinyML models. Our study reveals that, among pruned source-task models, the one with the highest activation entropy performs best on the target task. Motivated by this, we propose an entropy-based Efficient Neural Transfer with Reduced Overhead via PrunIng (ENTROPI) algorithm. Through comprehensive experiments on diverse models (ResNet18 and MobileNet-v3) and target datasets (CIFAR-100, VLCS, and PACS), we substantiate the superior generalization achieved by transfer learning from the entropy-pruned model. Quantitative entropy measures provide insight into the reasons behind the observed performance improvements. The results underscore ENTROPI's potential as an efficient solution for enhancing generalization in data-limited transfer learning tasks.
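The abstract's core idea is that, among several pruned source-task models, the one whose activations have the highest Shannon entropy is the best starting point for transfer. As a rough illustration only (not the authors' ENTROPI implementation), the PyTorch sketch below estimates a layer's activation entropy by histogramming its activations over a small calibration set and then picks the highest-entropy candidate; the function names, the binning-based entropy estimator, and the `layer_getter` helper are assumptions made for this sketch.

```python
import torch

@torch.no_grad()
def activation_entropy(model, layer, loader, n_bins=64, device="cpu"):
    # Hypothetical estimator (not the paper's exact one): histogram the
    # activations of `layer` over a calibration set and compute the
    # Shannon entropy H = -sum_i p_i * log(p_i) over the bins.
    acts = []
    hook = layer.register_forward_hook(
        lambda module, inputs, output: acts.append(output.detach().flatten().cpu())
    )
    model.eval().to(device)
    for x, _ in loader:
        model(x.to(device))
    hook.remove()
    values = torch.cat(acts)
    hist = torch.histc(values, bins=n_bins,
                       min=values.min().item(), max=values.max().item())
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 = 0)
    return float(-(p * p.log()).sum())

def select_transfer_source(candidates, layer_getter, loader):
    # Among candidate pruned source-task models, return the name of the one
    # whose chosen layer has the highest activation entropy.
    scores = {name: activation_entropy(m, layer_getter(m), loader)
              for name, m in candidates.items()}
    return max(scores, key=scores.get), scores
```

In this sketch, `candidates` maps names to pruned source models, `layer_getter` selects the layer whose activations are scored (for example, the penultimate feature layer), and `loader` is a small calibration DataLoader from the source task; all of these names are illustrative placeholders rather than the paper's API.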

Dates and versions

hal-04269159, version 1 (06-11-2023)

Identifiers

  • HAL Id: hal-04269159, version 1

Cite

Gabriele Spadaro, Riccardo Renzulli, Andrea Bragagnolo, Jhony H. Giraldo, Attilio Fiandrotti, et al. Shannon Strikes Again! Entropy-Based Pruning in Deep Neural Networks for Transfer Learning Under Extreme Memory and Computation Budgets. Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, Oct 2023, Paris, France, pp. 1518-1522. ⟨hal-04269159⟩