Convolutional network fabric pruning with label noise - Archive ouverte HAL
Journal article: Artificial Intelligence Review, 2023

Convolutional network fabric pruning with label noise

Abstract

This paper presents an iterative pruning strategy for Convolutional Network Fabrics (CNF) in the presence of noisy training and testing data. With the continuous increase in the size of neural network models, various authors have developed pruning approaches to build more compact network structures that require fewer resources while preserving performance. As we show in this paper, Convolutional Network Fabrics are, because of their intrinsic structure and function, ideal candidates for pruning. We present a series of pruning strategies that can significantly reduce both the final network size and the required training time by pruning either entire convolutional filters or individual weights, so that the fabric grid remains visually interpretable while overall performance stays within controllable bounds. Our approach can be applied iteratively during training, so that network complexity decreases rapidly and computational time is saved. The paper addresses both data-dependent and data-independent strategies, and experimentally establishes the most efficient approaches when the training or testing data contain annotation errors.
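To make the general idea of the abstract concrete, the sketch below shows what iterative pruning interleaved with training can look like in practice. It is a minimal, hypothetical illustration using PyTorch's standard pruning utilities, not the authors' CNF implementation: the model, data loader, optimizer settings and pruning amounts are placeholders, and the structured/unstructured split simply mirrors the paper's distinction between pruning entire convolutional filters and pruning individual weights.

```python
# Minimal sketch (assumed setup, not the paper's code): alternate training
# epochs with pruning steps so the network shrinks during training rather
# than being pruned once at the end.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


def train_one_epoch(model, loader, optimizer, loss_fn):
    model.train()
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()


def prune_step(model, filter_amount=0.1, weight_amount=0.1):
    """Prune every Conv2d layer two ways: whole filters (structured, by L1
    norm over output channels) and individual weights (unstructured, by
    smallest magnitude). The amounts are illustrative placeholders."""
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            prune.ln_structured(module, name="weight",
                                amount=filter_amount, n=1, dim=0)
            prune.l1_unstructured(module, name="weight",
                                  amount=weight_amount)


def iterative_prune_train(model, loader, epochs=10, prune_every=2):
    """Iterative schedule: train, then prune every few epochs, so that
    remaining weights can adapt to the reduced structure."""
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        train_one_epoch(model, loader, optimizer, loss_fn)
        if (epoch + 1) % prune_every == 0:
            prune_step(model)
```

How pruning decisions are scored (here, plain weight magnitude) is exactly where the paper's data-dependent versus data-independent criteria, and their robustness to label noise, would come into play.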
Main file: sn-article.pdf (436.16 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03569057, version 1 (12-02-2022)

Identifiers

HAL Id: hal-03569057
DOI: 10.1007/s10462-023-10507-2

Cite

Ilias Benjelloun, Bart Lamiroy, Efoevi Angelo Koudou. Convolutional network fabric pruning with label noise. Artificial Intelligence Review, 2023, 56 (12), pp.14841-14864. ⟨10.1007/s10462-023-10507-2⟩. ⟨hal-03569057⟩