Journal article, SIAM Journal on Mathematics of Data Science, 2024

ABBA Neural Networks: Coping with Positivity, Expressivity, and Robustness

Abstract

We introduce ABBA networks, a novel class of (almost) non-negative neural networks, which are shown to possess a series of appealing properties. In particular, we demonstrate that these networks are universal approximators while enjoying the advantages of non-negative weighted networks. We derive tight Lipschitz bounds in both the fully connected and convolutional cases. We propose a strategy for designing ABBA nets that are robust against adversarial attacks by finely controlling the Lipschitz constant of the network during the training phase. We show that our method outperforms other state-of-the-art defenses against white-box adversarial attacks. Experiments are performed on image classification tasks on four benchmark datasets.
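As a rough illustration of the kind of control the abstract refers to, the sketch below builds a small fully connected network with non-negative weights and monitors the classical Lipschitz upper bound given by the product of the layers' spectral norms (valid for 1-Lipschitz activations such as ReLU). It is a generic illustration only: the layer sizes, the projection-after-update scheme, and the bound shown here are assumptions, and the paper's ABBA construction and its tight bounds are not reproduced.

```python
import torch
import torch.nn as nn

# Hypothetical toy setup: a small non-negative fully connected network.
# This is NOT the ABBA architecture from the paper; it only illustrates
# (i) keeping weights non-negative by projection after each update and
# (ii) the classical Lipschitz upper bound prod_i ||W_i||_2 for
# 1-Lipschitz activations such as ReLU (the paper derives tighter bounds).

class NonNegativeMLP(nn.Module):
    def __init__(self, sizes=(784, 128, 64, 10)):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(m, n) for m, n in zip(sizes[:-1], sizes[1:])
        )

    def forward(self, x):
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i < len(self.layers) - 1:
                x = torch.relu(x)  # ReLU is 1-Lipschitz
        return x

    @torch.no_grad()
    def project_nonnegative(self):
        # Project weights onto the non-negative orthant after each optimizer step.
        for layer in self.layers:
            layer.weight.clamp_(min=0.0)

    @torch.no_grad()
    def lipschitz_upper_bound(self):
        # Product of spectral norms: a (generally loose) Lipschitz bound.
        bound = 1.0
        for layer in self.layers:
            bound *= torch.linalg.matrix_norm(layer.weight, ord=2).item()
        return bound


model = NonNegativeMLP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.rand(32, 784), torch.randint(0, 10, (32,))

loss = nn.functional.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
model.project_nonnegative()           # keep weights non-negative
print(model.lipschitz_upper_bound())  # monitor the Lipschitz bound during training
```

This product-of-norms monitor only conveys the general idea of constraining the Lipschitz constant during training; the fine-grained control and the tight bounds specific to ABBA networks are developed in the paper itself.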
Main file: ABBA_Neural_Networks.pdf (12.69 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04386260 , version 1 (10-01-2024)
hal-04386260 , version 2 (26-07-2024)

Citation

Ana-Antonia Neacșu, Jean-Christophe Pesquet, Vlad Vasilescu, Corneliu Burileanu. ABBA Neural Networks: Coping with Positivity, Expressivity, and Robustness. SIAM Journal on Mathematics of Data Science, 2024, 6 (3), ⟨10.1137/23M1589591⟩. ⟨hal-04386260v2⟩