Journal article in SIAM Journal on Mathematics of Data Science, Year: 2024

ABBA Neural Networks: Coping with Positivity, Expressivity, and Robustness

Abstract

We introduce ABBA networks, a novel class of (almost) non-negative neural networks, which are shown to possess a series of appealing properties. In particular, we demonstrate that these networks are universal approximators while enjoying the advantages of non-negative weighted networks. We derive tight Lipschitz bounds in both the fully connected and convolutional cases. We propose a strategy for designing ABBA nets that are robust against adversarial attacks by finely controlling the Lipschitz constant of the network during the training phase. We show that our method outperforms other state-of-the-art defenses against white-box adversarial attacks. Experiments are performed on image classification tasks using four benchmark datasets.
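The abstract refers to finely controlling the network's Lipschitz constant during training. The paper's own ABBA construction and its tight bounds are detailed in the attached PDF; as a minimal, generic sketch of the kind of quantity involved, the Python snippet below evaluates the classical product-of-spectral-norms upper bound on the Lipschitz constant of a small fully connected network with 1-Lipschitz activations and non-negative weights, and also prints the spectral norm of the composed linear map. All matrix shapes and values are hypothetical; this is an illustration, not the paper's bound.

    import numpy as np

    def spectral_norm(W):
        # Largest singular value of W (operator 2-norm).
        return np.linalg.norm(W, ord=2)

    def product_of_norms_bound(weights):
        # Classical Lipschitz upper bound for a feedforward network with
        # 1-Lipschitz activations: product of the layers' spectral norms.
        return float(np.prod([spectral_norm(W) for W in weights]))

    # Hypothetical two-layer network with non-negative weights.
    rng = np.random.default_rng(0)
    W1 = np.abs(rng.standard_normal((64, 32)))   # first layer, 32 -> 64
    W2 = np.abs(rng.standard_normal((10, 64)))   # second layer, 64 -> 10

    print(product_of_norms_bound([W1, W2]))      # loose layer-wise bound
    print(spectral_norm(W2 @ W1))                # norm of the composed linear map

A training-time defense of the kind described in the abstract would typically monitor or penalize such a bound so that the end-to-end Lipschitz constant stays below a prescribed level.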
Main file
siamonline_220329.pdf (7.03 MB)
supplemental.pdf (5.59 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04386260, version 1 (10-01-2024)

License

Attribution - NonCommercial (CC BY-NC)

Identifiers

  • HAL Id: hal-04386260, version 1

Cite

Ana-Antonia Neacșu, Jean-Christophe Pesquet, Vlad Vasilescu, Corneliu Burileanu. ABBA Neural Networks: Coping with Positivity, Expressivity, and Robustness. SIAM Journal on Mathematics of Data Science, In press. ⟨hal-04386260⟩