Conference paper, 2023

Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?

Abstract

Pruning is a widely used technique for reducing the size of deep neural networks while maintaining their performance. However, despite its ability to massively compress deep models, such a technique is hardly able to remove entire layers from a model (even when structured): is this an addressable task? In this study, we introduce EGP, an Entropy Guided Pruning algorithm aimed at reducing the size of deep neural networks while preserving their performance. The key focus of EGP is to prioritize pruning connections in layers with low entropy, ultimately leading to the complete removal of those layers. Through extensive experiments on popular models such as ResNet-18 and Swin-T, we demonstrate that EGP effectively compresses deep neural networks while maintaining competitive performance. Our results not only shed light on the mechanism behind the advantages of unstructured pruning, but also pave the way for further investigation of the relationship between entropy, pruning techniques, and deep learning performance. The EGP algorithm and its insights hold promise for advancing network compression and optimization. The source code for EGP is released open-source.
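
For intuition, here is a minimal PyTorch sketch of what entropy-guided unstructured pruning can look like: a layer's entropy is estimated from how often its (post-ReLU) units fire on a calibration batch, and layers with lower entropy receive a larger pruning fraction, pushing them toward complete removal. The entropy definition, the allocation rule, and the helper names (layer_entropy, entropy_guided_prune, base_amount) are assumptions for illustration; the authors' released EGP code is the authoritative reference.

```python
# A minimal sketch of entropy-guided unstructured pruning, assuming an
# entropy measure based on how often ReLU units fire. This illustrates
# the idea in the abstract, not the authors' exact EGP procedure.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def layer_entropy(activations: torch.Tensor) -> float:
    """Mean binary entropy of each unit's on/off state over a batch.
    A layer whose units are almost always on (or almost always off)
    carries little information and is a candidate for heavier pruning."""
    p_on = (activations > 0).float().mean(dim=0).clamp(1e-6, 1 - 1e-6)
    h = -(p_on * p_on.log2() + (1 - p_on) * (1 - p_on).log2())
    return h.mean().item()

def entropy_guided_prune(model: nn.Module, entropies: dict[str, float],
                         base_amount: float = 0.2) -> None:
    """Prune each Conv2d/Linear layer by a fraction that grows as its
    entropy shrinks (a hypothetical allocation rule): a zero-entropy
    layer is pruned entirely and can then be removed from the network."""
    max_h = max(entropies.values()) or 1.0  # guard against all-zero entropies
    for name, module in model.named_modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)) and name in entropies:
            # Low-entropy layers get an amount approaching 1.0 (full removal).
            amount = base_amount + (1.0 - base_amount) * (1.0 - entropies[name] / max_h)
            prune.l1_unstructured(module, name="weight", amount=min(amount, 1.0))
```

In practice, the activations would be gathered with forward hooks over a small calibration set before each pruning round, and a layer whose remaining weights all reach zero can be bypassed at inference time, which is what reduces the effective depth of the network.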

Dates and versions

hal-04254405, version 1 (23-10-2023)

Identifiers

Cite

Zhu Liao, Victor Quétu, Van-Tam Nguyen, Enzo Tartaglione. Can Unstructured Pruning Reduce the Depth in Deep Neural Networks?. Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, Oct 2023, Paris, France. ⟨hal-04254405⟩
