Conference paper · Year: 2022

To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding

Abstract

Batch-Normalization (BN) layers have become fundamental components of ever more complex deep neural network architectures. Such models require acceleration for deployment on edge devices. However, BN layers add computational bottlenecks due to their sequential processing of operations: thus, a key yet often overlooked step in the acceleration process is BN layer folding. In this paper, we demonstrate that current BN folding approaches are suboptimal in terms of how many layers can be removed. We therefore provide a necessary and sufficient condition for BN folding and a corresponding optimal algorithm. The proposed approach systematically outperforms existing baselines and dramatically reduces the inference time of deep neural networks.
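The paper's necessary and sufficient condition and its optimal folding algorithm are given in the full text. For background only, below is a minimal sketch of the standard folding of a BN layer into a preceding convolution, on which such approaches build; the function name, parameter shapes, and the 1x1-convolution check are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold a BatchNorm layer that follows a convolution into the
    convolution's weights and bias, so the BN layer can be removed
    at inference time. (Illustrative sketch, not the paper's algorithm.)

    w:     conv weights, shape (out_channels, in_channels, kh, kw)
    b:     conv bias, shape (out_channels,)
    gamma, beta, mean, var: BN parameters, shape (out_channels,)
    """
    scale = gamma / np.sqrt(var + eps)          # per-output-channel scale
    w_folded = w * scale[:, None, None, None]   # rescale each filter
    b_folded = (b - mean) * scale + beta        # shift the bias accordingly
    return w_folded, b_folded

if __name__ == "__main__":
    # Sanity check on a 1x1 convolution applied to a single pixel,
    # where the conv reduces to a matrix-vector product.
    rng = np.random.default_rng(0)
    o, i = 4, 3
    w = rng.normal(size=(o, i, 1, 1))
    b = rng.normal(size=o)
    gamma, beta = rng.normal(size=o), rng.normal(size=o)
    mean, var = rng.normal(size=o), rng.uniform(0.5, 2.0, size=o)
    x = rng.normal(size=i)
    y = w[:, :, 0, 0] @ x + b                   # conv output
    bn = gamma * (y - mean) / np.sqrt(var + 1e-5) + beta
    wf, bf = fold_bn_into_conv(w, b, gamma, beta, mean, var)
    yf = wf[:, :, 0, 0] @ x + bf                # folded conv output
    assert np.allclose(bn, yf)                  # folding preserves the output
```

The identity behind the fold is that BN applies a per-channel affine map, which can be absorbed into the preceding convolution's per-channel weights and bias without changing the network's output.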

Dates and versions

hal-03953581 , version 1 (24-01-2023)

Identifiers

Cite

Edouard Yvinec, Arnaud Dapogny, Kevin Bailly. To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding. Thirty-First International Joint Conference on Artificial Intelligence (IJCAI-22), Jul 2022, Vienna, Austria. pp. 1601-1607, ⟨10.24963/ijcai.2022/223⟩. ⟨hal-03953581⟩