
Abstract

In recent years, a series of studies has revealed that Deep Neural Networks (DNNs) are vulnerable to adversarial attacks, and a number of attack methods have been proposed. Among these methods, a particularly stealthy one, the one-pixel attack, can mislead a DNN into misclassifying an image by modifying only a single pixel, posing a severe security threat to DNN-based information systems. To date, no method has been able to reliably detect the one-pixel attack; this paper fills that gap. Two detection methods are proposed: trigger detection and candidate detection. The trigger detection method analyzes the vulnerability of DNN models and reports the single pixel most likely to have been modified by a one-pixel attack. The candidate detection method identifies a set of the most suspicious pixels using a differential evolution-based heuristic algorithm. Experiments on real data show that the trigger detection method achieves a detection success rate of 9.1% and the candidate detection method achieves a detection success rate of 30.1%, validating the effectiveness of the proposed methods.
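The abstract gives no implementation details, but the candidate detection idea, searching for the pixel whose modification most strongly sways the classifier, lends itself to a differential-evolution search. The sketch below is purely illustrative and is not the authors' implementation; the Keras-style predict() interface, the (H, W, 3) image layout with values in [0, 1], the function name suspect_pixels, and all parameter settings are assumptions made for the example.

# Illustrative sketch only (not the paper's method): use SciPy's differential
# evolution to find single-pixel modifications that most strongly suppress the
# classifier's confidence in its predicted class.  Pixel locations found this
# way can serve as "candidate" suspects for a one-pixel attack.
# Assumptions: a Keras-style model exposing predict(), and images shaped
# (H, W, 3) with float values in [0, 1].
import numpy as np
from scipy.optimize import differential_evolution

def suspect_pixels(model, image, predicted_label, n_candidates=3):
    """Return a list of ((x, y), confidence) pairs, one per DE run."""
    h, w, _ = image.shape

    def confidence_after_change(z):
        # z = (x, y, r, g, b): overwrite one pixel and re-evaluate the model.
        x, y = int(z[0]), int(z[1])
        perturbed = image.copy()
        perturbed[y, x] = z[2:5]
        probs = model.predict(perturbed[np.newaxis])[0]
        # Lower confidence in the originally predicted class means the pixel
        # location is more "attack-like", so this value is minimized.
        return probs[predicted_label]

    bounds = [(0, w - 1), (0, h - 1), (0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]
    candidates = []
    for _ in range(n_candidates):
        result = differential_evolution(
            confidence_after_change, bounds,
            maxiter=20, popsize=10, polish=False)
        x, y = int(result.x[0]), int(result.x[1])
        candidates.append(((x, y), float(result.fun)))
    return candidates

Each run of the search returns one candidate location; repeating the search and ranking the results by how far the confidence drops gives a small set of suspect pixels, roughly in the spirit of the candidate detection method described above.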

Details

Title
Detection Mechanisms of One-Pixel Attack
Author
Wang, Peng 1; Cai, Zhipeng 1; Kim, Donghyun 1; Li, Wei 1

1 Department of Computer Science, Georgia State University, Atlanta 30303, USA
Editor
Wenzhong Li
Publication year
2021
Publication date
2021
Publisher
John Wiley & Sons, Inc.
e-ISSN
1530-8677
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2497887418
Copyright
Copyright © 2021 Peng Wang et al. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.