Journal article in IEEE Design & Test, 2022

SIT: Stochastic Input Transformation to Defend Against Adversarial Attacks on Deep Neural Networks

Abstract

Deep Neural Networks (DNNs) have been deployed in a wide range of applications, including safety-critical domains, owing to their proven efficiency in solving complex problems. However, these systems have been shown to be vulnerable to adversarial attacks: carefully crafted perturbations that threaten their integrity and trustworthiness. Several defenses have recently been proposed, but most of them are costly to deploy since they require retraining and specific fine-tuning procedures. Pre-processing defenses that avoid retraining do exist, yet they have been shown to be ineffective against adaptive white-box attacks. In this paper, we propose a model-agnostic defense against adversarial attacks based on stochastic pre-processing. Through a down-sampling/up-sampling process, we transform the input into a new sample that is (i) close enough to the initial input to be classified correctly, and (ii) different enough to discard any potential adversarial noise within it. The proposed defense is generic, easy to deploy, and does not require any specific training or fine-tuning. We evaluated our technique against state-of-the-art defenses under grey-box and strong white-box scenarios. Experimental results show that our defense achieves robustness of up to 94% and 93% against PGD and C&W attacks, respectively, under the strong white-box scenario.
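The abstract describes the defense as a randomized down-sampling/up-sampling pre-processing step applied before classification, with no retraining of the model. The sketch below illustrates that general idea only; the scale range, the use of bilinear interpolation, and the per-inference random factor are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch of a stochastic down-sampling/up-sampling pre-processing
# defense. The scale range and interpolation mode are assumptions, not the
# paper's exact parameters.
import random
import torch
import torch.nn.functional as F

def stochastic_input_transformation(x: torch.Tensor,
                                    min_scale: float = 0.5,
                                    max_scale: float = 0.9) -> torch.Tensor:
    """Randomly down-sample then up-sample a batch of images (N, C, H, W)."""
    _, _, h, w = x.shape
    scale = random.uniform(min_scale, max_scale)  # fresh randomness at every call
    low_h = max(1, int(h * scale))
    low_w = max(1, int(w * scale))
    x_low = F.interpolate(x, size=(low_h, low_w), mode="bilinear", align_corners=False)
    x_rec = F.interpolate(x_low, size=(h, w), mode="bilinear", align_corners=False)
    return x_rec

# Usage: wrap any pretrained classifier without retraining or fine-tuning, e.g.
# logits = model(stochastic_input_transformation(images))
```

Because the transformation is re-drawn at every inference, an adaptive attacker cannot rely on a fixed, differentiable pre-processing pipeline, which is the property the abstract attributes to the stochastic design.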
No file deposited

Dates and versions

hal-03546310, version 1 (27-01-2022)

Identifiers

Cite

Amira Guesmi, Ihsen Alouani, Mouna Baklouti, Tarek Frikha, Mohamed Abid. SIT: Stochastic Input Transformation to Defend Against Adversarial Attacks on Deep Neural Networks. IEEE Design & Test, 2022, 39, pp. 63-72. ⟨10.1109/MDAT.2021.3077542⟩. ⟨hal-03546310⟩