Journal article, Artificial Intelligence Review, 2023

Convolutional network fabric pruning with label noise

Abstract

This paper presents an iterative pruning strategy for Convolutional Network Fabrics (CNF) in the presence of noisy training and testing data. With the continuous increase in the size of neural network models, various authors have developed pruning approaches that build more compact network structures requiring fewer resources while preserving performance. As we show in this paper, Convolutional Network Fabrics are ideal candidates for pruning because of their intrinsic structure and function. We present a series of pruning strategies that can significantly reduce both the final network size and the required training time by pruning either entire convolutional filters or individual weights, so that the grid remains visually understandable while overall execution quality stays within controllable bounds. Our approach can be applied iteratively during training, so that network complexity decreases rapidly and computational time is saved. The paper addresses both data-dependent and data-independent strategies, and also experimentally establishes the most efficient approaches when the training or testing data contain annotation errors.
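The abstract describes pruning either entire convolutional filters or individual weights, applied iteratively during training. The sketch below illustrates that general idea using PyTorch's torch.nn.utils.prune utilities with a plain L1-magnitude criterion. It is only a generic illustration under assumed placeholders (model, pruning rate, schedule), not the CNF-specific pruning criteria, data-dependent strategies, or label-noise handling studied in the paper.

```python
# Minimal sketch of iterative pruning during training, assuming a generic
# L1-magnitude criterion. This is NOT the paper's CNF pruning method; the
# model, pruning rate and schedule below are placeholder assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


def prune_step(model: nn.Module, amount: float = 0.1, structured: bool = False) -> None:
    """Prune every Conv2d layer: individual weights (unstructured) or
    entire output filters (structured, L1 norm over the filter dimension)."""
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            if structured:
                # Remove whole filters (output channels) with the smallest L1 norm.
                prune.ln_structured(module, name="weight", amount=amount, n=1, dim=0)
            else:
                # Remove the individual weights with the smallest magnitude.
                prune.l1_unstructured(module, name="weight", amount=amount)


def train_with_iterative_pruning(model, loader, epochs=10, prune_every=2):
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        # Periodically remove a fraction of the remaining weights or filters,
        # so that network complexity decreases as training progresses.
        if (epoch + 1) % prune_every == 0:
            prune_step(model, amount=0.1, structured=False)
```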
Main file
sn-article.pdf (436.16 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03569057, version 1 (12-02-2022)

Identifiers

Cite

Ilias Benjelloun, Bart Lamiroy, Efoevi Angelo Koudou. Convolutional network fabric pruning with label noise. Artificial Intelligence Review, 2023, 56 (12), pp.14841-14864. ⟨10.1007/s10462-023-10507-2⟩. ⟨hal-03569057⟩
102 Views
87 Downloads

