This paper traces the development routes of two backbone network variants, CNNs and transformers, in deep-learning-based side-channel analysis (SCA). Based on combined parametric and quantitative structural analyses, we choose to focus improvement work on the CNN line of variants. First, Zaid’s Efficient CNN and Wouters’ Simplified Efficient CNN are reproduced and compared; second, a ResNet built on the residual structure and an improved feature-coded CNN are designed and implemented; finally, the four methods are evaluated in extensive experiments on the public DPA_contest v4.1, AES_RD, AES_HD, and ASCAD datasets to explore the effects of six preprocessing methods at three scales, and to further examine data augmentation of the datasets by noise, offset, and amplitude scaling. The experimental results show that the residual-structure ResNet and the feature-coded CNN proposed in this paper outperform the earlier methods. Selecting a suitable preprocessing method for each dataset further reduces the mean rank, and stacking data-augmentation methods on top makes the models converge more easily and improves their generalization. To advance research on CNN and transformer variants in the SCA field, the models obtained from the above experiments and the processed datasets have been made publicly available on GitHub.
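To illustrate the three trace-level augmentations mentioned above (additive noise, time offset, and amplitude scaling), the following is a minimal NumPy sketch; the function name, parameter defaults, and the assumption that traces are stored as a 2-D (n_traces × n_samples) array are illustrative choices, not taken from the paper’s released code.

```python
import numpy as np

def augment_traces(traces, noise_std=0.01, max_shift=5, scale_range=(0.9, 1.1), rng=None):
    """Illustrative side-channel trace augmentation: Gaussian noise, random offset, amplitude scaling.

    `traces` is assumed to be a 2-D array of shape (n_traces, n_samples).
    All parameter values are placeholders, not the paper's settings.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_traces, _ = traces.shape

    # 1) Additive Gaussian noise on every sample point.
    noisy = traces + rng.normal(0.0, noise_std, size=traces.shape)

    # 2) Random offset: circularly shift each trace by a few samples in time.
    shifts = rng.integers(-max_shift, max_shift + 1, size=n_traces)
    shifted = np.stack([np.roll(t, s) for t, s in zip(noisy, shifts)])

    # 3) Amplitude scaling: multiply each trace by a random per-trace factor.
    scales = rng.uniform(scale_range[0], scale_range[1], size=(n_traces, 1))
    return shifted * scales
```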
Li, Wenchang 2; Cao, Xiaodong 1; Fu, Yihao 3; Li, Xiang 1; Liu, Jian 4; Chen, Aidong 3; Zhang, Yanlong 5; Wang, Shuo 5; Zhou, Jing 5
1 Artificial Intelligence and High-Speed Circuits Laboratory, Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China; [email protected] (W.L.); [email protected] (X.C.); [email protected] (X.L.); University of Chinese Academy of Sciences, Beijing 100049, China
2 Key Laboratory of Solid-State Optoelectronic Information Technology, Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China; College of Microelectronics, University of Chinese Academy of Sciences, Beijing 100049, China
3 Multi-Agent Systems Research Center, School of Robotics, Beijing Union University, Beijing 100101, China; [email protected] (Y.F.); [email protected] (A.C.)
4 State Key Laboratory of Semiconductor Physics and Chip Technologies, Institute of Semiconductors, Chinese Academy of Sciences, Beijing 100083, China; [email protected]
5 Beijing Institute of Microelectronics Technology, Beijing 100076, China; [email protected] (Y.Z.); [email protected] (S.W.); [email protected] (J.Z.)