Abstract

Although classification methods based on deep neural networks achieve excellent results on classification tasks, they are difficult to apply in real-time scenarios because of their high memory footprints and prohibitive inference times. Compared with unstructured pruning, structured pruning techniques reduce the runtime computation cost of a model more effectively, but they inevitably reduce its accuracy. Traditional methods rely on fine-tuning to restore the damaged performance, yet a large gap remains between the pruned model and the original one. In this paper, we use progressive multi-level distillation learning to compensate for the loss caused by pruning. The networks before and after pruning serve as the teacher and student networks, respectively. The proposed approach exploits the complementary properties of structured pruning and knowledge distillation, allowing the pruned network to learn both the intermediate and the output representations of the teacher network and thereby mitigating the performance degradation caused by pruning. Experiments demonstrate that our approach performs better on the CIFAR-10, CIFAR-100, and Tiny-ImageNet datasets across different pruning rates. For instance, GoogLeNet achieves near-lossless pruning on CIFAR-10 at a 60% pruning rate. Moreover, this paper also shows that applying the proposed distillation learning during the pruning process yields more significant performance gains than applying it after pruning is complete.
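To make the idea of multi-level distillation concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: the unpruned (teacher) network supervises the pruned (student) network at both the output level and intermediate feature levels. The function name multilevel_distillation_loss, the temperature T, the weights alpha and beta, and the assumption that intermediate features have already been projected to matching shapes are all illustrative choices, not details taken from the paper.

```python
# Minimal sketch of a multi-level distillation objective (assumed form, not the paper's exact loss).
import torch
import torch.nn.functional as F

def multilevel_distillation_loss(student_logits, teacher_logits,
                                 student_feats, teacher_feats,
                                 labels, T=4.0, alpha=0.7, beta=0.3):
    """student_feats / teacher_feats: lists of intermediate feature maps taken at
    matching stages of the pruned (student) and original (teacher) networks."""
    # Supervised loss on ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Output-level distillation: KL divergence between temperature-softened class distributions.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)

    # Intermediate-level distillation: match feature maps stage by stage.
    # After structured pruning the student usually has fewer channels, so a small
    # adaptation layer (e.g. a 1x1 conv) would normally align shapes first; here we
    # assume that projection has already been applied.
    feat = sum(F.mse_loss(s, t) for s, t in zip(student_feats, teacher_feats)) / len(student_feats)

    return (1 - alpha) * ce + alpha * kd + beta * feat
```

In this sketch, raising beta emphasizes the intermediate representations, while alpha balances the softened teacher outputs against the hard labels; the paper's progressive scheme applies such supervision during pruning rather than only afterwards.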

Details

Title
Progressive multi-level distillation learning for pruning network
Author
Wang, Ruiqing 1; Wan, Shengmin 1; Zhang, Wu 2; Zhang, Chenlu 1; Li, Yu 1; Xu, Shaoxiang 1; Zhang, Lifu 1; Jin, Xiu 2; Jiang, Zhaohui 2; Rao, Yuan 2

1 Anhui Agricultural University, School of Information and Computer, Hefei, China (GRID:grid.411389.6) (ISNI:0000 0004 1760 4804)
2 Anhui Agricultural University, School of Information and Computer, Hefei, China (GRID:grid.411389.6) (ISNI:0000 0004 1760 4804); Anhui Agricultural University, Anhui Province Key Laboratory of Smart Agricultural Technology and Equipment, Hefei, China (GRID:grid.411389.6) (ISNI:0000 0004 1760 4804)
Pages
5779-5791
Publication year
2023
Publication date
Oct 2023
Publisher
Springer Nature B.V.
ISSN
2199-4536
e-ISSN
2198-6053
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2867416578
Copyright
© The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.