Zero-Overhead Protection for CNN Weights
Abstract
The numerical format used to represent weights and activations plays a key role in the computational efficiency and robustness of CNNs. Recently, a 16-bit floating-point format called Brain-Float 16 (bf16) has been proposed and implemented in hardware accelerators. However, the robustness of accelerators implemented with this format has not yet been studied. In this paper, we compare the robustness of state-of-the-art CNNs implemented with the 8-bit integer, Brain-Float 16, and 32-bit floating-point formats. We also introduce an error detection and masking technique, called opportunistic parity (OP), which can detect and mask errors in the weights with zero storage overhead. With this technique, the robustness of floating-point weights to bit flips can be improved by up to three orders of magnitude.
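To make the zero-overhead idea concrete, the sketch below shows one plausible way a parity bit can be folded into a weight without extra storage: trained CNN weights typically have magnitude below 2, so the most significant exponent bit of an IEEE-754 float32 word is always 0 and can host a parity bit. The choice of bit 30 as the host bit and zeroing the weight as the masking policy are assumptions for illustration; the abstract does not specify the paper's exact encoding, so this is a minimal sketch of the general technique rather than the authors' implementation.

```python
import struct

def f32_to_bits(w: float) -> int:
    return struct.unpack("<I", struct.pack("<f", w))[0]

def bits_to_f32(b: int) -> float:
    return struct.unpack("<f", struct.pack("<I", b))[0]

# Assumed host bit: MSB of the float32 exponent, which is 0
# for any weight with |w| < 2 (true for typical trained CNNs).
PARITY_BIT = 1 << 30

def encode(w: float) -> int:
    """Fold even parity over the remaining 31 bits into bit 30."""
    b = f32_to_bits(w)
    assert b & PARITY_BIT == 0, "requires |w| < 2"
    parity = bin(b).count("1") & 1
    return b | (PARITY_BIT if parity else 0)

def decode(b: int) -> float:
    """Check parity; on mismatch, mask the weight (here: zero it)."""
    if bin(b).count("1") & 1:   # total parity of the word must be even
        return 0.0              # single bit flip detected -> mask
    return bits_to_f32(b & ~PARITY_BIT)
```

Any single bit flip in the stored word makes the total parity odd, so the error is detected on read and the weight is masked; for error-free words, decoding simply clears the host bit and recovers the original value exactly, with no additional memory used.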