Conference paper, Year: 2023

Grow, prune or select data: which technique allows the most energy-efficient neural network training?

Abstract

The energy efficiency of deep neural network training has become an extensively studied research topic in recent years. Some existing approaches seek to reduce the size of the architecture, either by starting training with a large network and pruning it, or by beginning with a seed architecture and then growing it. Instead of compressing the architecture, other approaches aim to reduce the number of training examples through data selection. While various approaches belonging to these two categories have been proposed, only a few works actually conduct energy measurements; others merely mention potential efficiency gains or rely on proxy metrics such as FLOPs. In this paper, we conduct a series of experiments, both on a synthetic dataset and on image classification benchmarks, to compare the impact of pruning, architecture growing and data selection on training energy consumption and prediction quality. Our results show that growing maintains high prediction quality but yields limited energy gains when the resulting architecture is large. Pruning can offer substantial gains but also reduces accuracy, making it better suited to large models. Data selection provides energy gains correlated with the selectivity rate but incurs an accuracy loss. We find that the effectiveness of each technique depends on its hyperparameters and on the architecture size.
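To make two of the compared families concrete, the sketch below (not taken from the paper, and assuming a standard PyTorch setup with hypothetical sizes and hyperparameters) illustrates magnitude-based pruning and random data selection on a toy model; the growing strategy, which starts from a small seed network and enlarges it during training, is omitted for brevity.

# Minimal sketch, not the authors' exact method: illustrates pruning and
# data selection on a toy MLP. All names, sizes and rates are illustrative.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torch.utils.data import TensorDataset, Subset, DataLoader

# Toy dataset and model.
X, y = torch.randn(1000, 20), torch.randint(0, 2, (1000,))
full_train = TensorDataset(X, y)
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))

# Pruning: zero out a fraction of the smallest-magnitude weights in each
# linear layer (L1 unstructured pruning), shrinking the effective model.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Data selection: train on a random subset of the examples; the
# selectivity rate (here 50%) controls the expected energy saving.
selectivity = 0.5
kept = torch.randperm(len(full_train))[: int(selectivity * len(full_train))]
train_loader = DataLoader(Subset(full_train, kept.tolist()), batch_size=32)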
Main file
ictai_paper_99.pdf (1.11 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04282146, version 1 (13-11-2023)

Identifiers

Cite

Anais Boumendil, Walid Bechkit, Pierre-Edouard Portier, Frédéric Le Mouël, Malcolm Egan. Grow, prune or select data: which technique allows the most energy-efficient neural network training?. 2023 IEEE 35th International Conference on Tools with Artificial Intelligence (ICTAI), Nov 2023, Atlanta (GA), United States. ⟨10.1109/ICTAI59109.2023.00051⟩. ⟨hal-04282146⟩
210 Views
247 Downloads
