JOURNAL OF SHANDONG UNIVERSITY(NATURAL SCIENCE) ›› 2019, Vol. 54 ›› Issue (1): 60-66.doi: 10.6040/j.issn.1671-9352.3.2018.004


Comparative study on neural network structures in power analysis

LIU Biao, LU Zhe, HUANG Yu-wei, JIAO Meng, LI Quan-qi, XUE Rui   

  1. Management Department, Beijing Electronic Science and Technology Institute, Beijing 100071, China
  • Published: 2019-01-23

Abstract: To explore how different neural network architectures perform in power analysis, we conduct experiments on the DPA_Contest_V4 dataset. After the mask is recovered, a deep neural network and traditional machine-learning models such as SVM are first applied. Next, we analyze how changes to the structure of the neural network model affect the experimental results. Finally, a comprehensive comparison of the different network models is made, including the recurrent neural network. The experimental results show that, under identical experimental conditions, the neural network models outperform the traditional machine-learning models, and the recurrent neural network model outperforms the deep neural network model. Moreover, using different activation functions in different layers of a neural network can lead to large changes in the experimental results.
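The comparison the abstract describes — a traditional machine-learning model (SVM) against a neural network on trace classification — can be sketched as follows. This is a hypothetical illustration, not the authors' code: real experiments would use DPA Contest V4 power traces, whereas here synthetic traces whose mean level leaks the Hamming weight of an intermediate byte stand in for them, and the model choices (RBF-kernel SVM, two-layer MLP) are assumptions.

```python
# Hypothetical sketch of the SVM-vs-neural-network comparison.
# Synthetic "power traces" stand in for DPA Contest V4 data: each
# trace's mean level leaks the Hamming weight of a random byte.
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_traces(n, length=50):
    """Synthetic traces labeled with the Hamming weight (0..8) of a byte."""
    values = rng.integers(0, 256, size=n)
    labels = np.array([bin(v).count("1") for v in values])
    noise = rng.normal(0.0, 1.0, size=(n, length))
    traces = noise + labels[:, None] * 0.5   # leakage: HW shifts the mean
    return traces, labels

X, y = make_traces(3000)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Traditional machine learning: support vector machine with RBF kernel.
svm = SVC(kernel="rbf").fit(Xtr, ytr)

# Small feed-forward neural network (MLP) as the deep-learning baseline.
mlp = MLPClassifier(hidden_layer_sizes=(64, 32), activation="relu",
                    max_iter=500, random_state=0).fit(Xtr, ytr)

print("SVM accuracy:", svm.score(Xte, yte))
print("MLP accuracy:", mlp.score(Xte, yte))
```

On such clean synthetic leakage both models classify well above chance; the paper's point is that on real masked traces the relative performance of the model families, and the choice of activation function per layer, matter considerably.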

Key words: deep learning, neural network, power analysis

CLC Number: TP309.2
[1] LERMAN L, POUSSIER R, BONTEMPI G, et al. Template attacks vs. machine learning revisited(and the curse of dimensionality in side-channel analysis)[C] // Constructive Side-Channel Analysis and Secure Design. Berlin: Springer International Publishing, 2015: 20-33.
[2] GILMORE R, HANLEY N, et al. Neural network based attack on a masked implementation of AES[C] // IEEE International Symposium on Hardware Oriented Security & Trust. Belfast: IEEE, 2015: 106-111.
[3] MARTINASEK Z, ZAPLETAL O, et al. Power analysis attack based on the MLP in DPA Contest V4[C] // International Conference on Telecommunications & Signal Processing. Kerala: IEEE, 2016: 223-226.
[4] HAYKIN S. Neural networks and machine learning[M]. Translated by SHEN Furao, et al. Beijing: China Machine Press, 2017.
[5] GOODFELLOW I, BENGIO Y, COURVILLE A. Deep learning[M]. Translated by ZHAO Shenjian, et al. Beijing: Posts & Telecom Press, 2017.
[6] Telecom Paristech Sen Research Group. DPA Contest(4th edition),2013-2014[EB/OL].[2014-5-12].http://www.dpacontest.org/V4/.
[7] NASSAR M,SOUISSI Y, GUILLEY S, et al. RSM: a small and fast countermeasure for AES, secure against 1st and 2nd-order zero-offset SCAs[C] // Design, Automation & Test in Europe Conference and Exposition. Dresden: IEEE Computer Society, 2012: 1173-1178.
[8] RECHBERGER C, OSWALD E. Practical template attacks[J]. Lecture Notes in Computer Science, 2005, 3325(3):440-456.
[9] DENG Gaoming, ZHANG Peng, ZHAO Qiang, et al. Electromagnetic template analysis attack based on PCA and SVM[J]. Computer Measurement & Control, 2009, 17(9): 1837-1839.
[10] ZENG Z, GU D, LIU J, et al. An improved side-channel attack based on support vector machine[C] // Tenth International Conference on Computational Intelligence and Security. Liverpool: IEEE, 2014: 676-680.