Conference Paper, 2021

Zero-Overhead Protection for CNN Weights

Abstract

The numerical format used for representing weights and activations plays a key role in the computational efficiency and robustness of CNNs. Recently, a 16-bit floating-point format called Brain-Float 16 (bf16) has been proposed and implemented in hardware accelerators. However, the robustness of accelerators implementing this format has not yet been studied. In this paper, we compare the robustness of state-of-the-art CNNs implemented with 8-bit integer, Brain-Float 16, and 32-bit floating-point formats. We also introduce an error detection and masking technique, called opportunistic parity (OP), which can detect and mask errors in the weights with zero storage overhead. With this technique, the robustness of floating-point weights to bit-flips can be improved by up to three orders of magnitude.
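To make the idea of zero-overhead parity protection concrete, the following Python sketch illustrates one way such a scheme can work. It assumes, hypothetically, that a single bf16 bit position (here the exponent MSB) is constant for typical small-magnitude CNN weights and can therefore be repurposed to hold a parity bit at no extra storage cost, and that a detected mismatch masks the weight to zero. The chosen bit position, function names, and masking policy are illustrative assumptions, not the encoding defined in the paper.

# Illustrative sketch of parity-protected bfloat16 weights; not the paper's
# exact opportunistic-parity encoding.  Assumption: for weights with |w| < 2,
# the exponent MSB of the bf16 word is always 0, so it can hold a parity bit.
import struct

PARITY_BIT = 14  # bf16 exponent MSB, assumed constant for small weights


def float_to_bf16(x: float) -> int:
    """Truncate a float32 to its upper 16 bits (bfloat16 bit pattern)."""
    bits32 = struct.unpack("<I", struct.pack("<f", x))[0]
    return bits32 >> 16


def bf16_to_float(bits16: int) -> float:
    """Expand a bfloat16 bit pattern back to float32."""
    return struct.unpack("<f", struct.pack("<I", bits16 << 16))[0]


def parity(bits: int) -> int:
    """Even parity over the 15 bits other than the repurposed position."""
    bits &= ~(1 << PARITY_BIT)
    return bin(bits).count("1") & 1


def encode(weight: float) -> int:
    """Overwrite the assumed-constant bit with the parity of the other bits."""
    bits = float_to_bf16(weight) & ~(1 << PARITY_BIT)
    return bits | (parity(bits) << PARITY_BIT)


def decode(bits: int) -> float:
    """Recompute parity; on mismatch, detect the error and mask the weight."""
    stored = (bits >> PARITY_BIT) & 1
    if stored != parity(bits):
        return 0.0  # error detected: mask the faulty weight to zero
    return bf16_to_float(bits & ~(1 << PARITY_BIT))


if __name__ == "__main__":
    w = 0.3125                           # exactly representable in bf16
    protected = encode(w)
    assert decode(protected) == w        # clean read-back
    corrupted = protected ^ (1 << 7)     # inject a single bit-flip
    print(decode(corrupted))             # 0.0: error detected and masked

Masking a corrupted weight to zero is a common fault-tolerance policy for CNNs because a zero weight simply drops one contribution to a dot product, which is usually far less damaging than a bit-flip in a high-order exponent bit.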
Main file: opportunistic_parity_dfts (2).pdf (344.09 KB). Origin: files produced by the author(s).

Dates and versions

hal-03470345, version 1 (08-12-2021)

Identifiers

HAL Id: hal-03470345
DOI: 10.1109/DFT52944.2021.9568363

Cite

Stéphane Burel, Adrian Evans, Lorena Anghel. Zero-Overhead Protection for CNN Weights. 2021 IEEE International Symposium on Defect and Fault Tolerance in VLSI and Nanotechnology Systems (DFT), Oct 2021, Athens (virtual), Greece. ⟨10.1109/DFT52944.2021.9568363⟩. ⟨hal-03470345⟩