Conference paper, Year: 2022

Deep networks with ReLU activation functions can be smooth statistical models

Abstract

Most deep neural networks use ReLU activation functions. Since these functions are not differentiable at 0, one might expect such models to behave irregularly. In this paper, we show that the issue lies more in the data than in the model: if the data are “smooth”, the model will be differentiable in a suitable sense. We give a striking illustration of this fact with the example of adversarial attacks.
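To make the pointwise non-differentiability concrete, here is a minimal numerical sketch (not from the paper; it assumes only NumPy) showing that the one-sided difference quotients of ReLU disagree at the origin, so no ordinary derivative exists there:

import numpy as np

def relu(x):
    # ReLU activation: max(0, x)
    return np.maximum(0.0, x)

h = 1e-6  # small step for one-sided difference quotients

# Right-hand derivative at 0: (relu(h) - relu(0)) / h -> 1
right = (relu(h) - relu(0.0)) / h
# Left-hand derivative at 0: (relu(0) - relu(-h)) / h -> 0
left = (relu(0.0) - relu(-h)) / h

print(f"right-hand derivative at 0: {right:.1f}")  # 1.0
print(f"left-hand derivative at 0:  {left:.1f}")   # 0.0
# The two one-sided derivatives differ, so ReLU itself is not
# differentiable at 0. The paper's point is that, when the input
# data distribution is smooth, the resulting statistical model can
# nevertheless be differentiable in a suitable sense.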
No file deposited

Dates and versions

hal-03957143, version 1 (26-01-2023)

Identifiers

Cite

Joseph Rynkiewicz. Deep networks with ReLU activation functions can be smooth statistical models. ESANN 2022 - European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Oct 2022, Bruges, Belgium (and online), pp. 229-234. ⟨10.14428/esann/2022.ES2022-20⟩. ⟨hal-03957143⟩

Collections

UNIV-PARIS1 SAMM