1 KRIZHEVSKY A , SUTSKEVER I , HINTON G E . ImageNet classification with deep convolutional neural networks[C]// Proceedings of the 25th International Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2012: 1097-1105.
2 SIMONYAN K , ZISSERMAN A . Very deep convolutional networks for large-scale image recognition[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1409.1556.pdf.
3 SZEGEDY C , LIU W , JIA Y , et al . Going deeper with convolutions[C]// Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2015: 1-9.
4 彭冬亮,王天兴 . 基于GoogLeNet模型的剪枝算法[J]. 控制与决策, 2019, 34(6):1259-1264. (PENG D L, WANG T X. Pruning algorithm based on GoogLeNet model[J]. Control and Decision, 2019, 34(6):1259-1264.)
5 HE K , ZHANG X , REN S , et al . Deep residual learning for image recognition[C]// Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2016:770-778.
6 HUANG G , LIU Z , VAN DER MAATEN L , et al . Densely connected convolutional networks[C]// Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017: 2261-2269.
7 WANG J , ZHANG J , BAO W , et al . Not just privacy: improving performance of private deep learning in mobile cloud[C]// Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2018: 2407-2416.
8 ZHU M H , GUPTA S . To prune, or not to prune: exploring the efficacy of pruning for model compression[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1710.01878.pdf.
9 WANG H , ZHANG Q , WANG Y , et al . Structured deep neural network pruning by varying regularization parameters[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1804.09461.pdf.
10 JÉGOU H , DOUZE M , SCHMID C . Product quantization for nearest neighbor search[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(1): 117-128.
11 HINTON G , VINYALS O , DEAN J . Distilling the knowledge in a neural network[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1503.02531.pdf.
12 HOWARD A G , ZHU M , CHEN B , et al . MobileNets: efficient convolutional neural networks for mobile vision applications[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1704.04861.pdf.
13 ZHANG X , ZHOU X , LIN M , et al . ShuffleNet: an extremely efficient convolutional neural network for mobile devices[C]// Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2018: 6848-6856.
14 DENIL M , SHAKIBI B , DINH L , et al . Predicting parameters in deep learning[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2013: 2148-2156.
15 HASSIBI B , STORK D G , WOLFF G J . Optimal brain surgeon and general network pruning[C]// Proceedings of the 1993 IEEE International Conference on Neural Networks. Piscataway: IEEE, 1993: 293-299.
16 HAN S , MAO H , DALLY W J . Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1510.00149.pdf.
17 MOLCHANOV P , TYREE S , KARRAS T , et al . Pruning convolutional neural networks for resource efficient inference[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1611.06440.pdf.
18 SUN Y , WANG X , TANG X . Sparsifying neural network connections for face recognition[C]// Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2016: 4856-4864.
19 LUO J , WU J . An entropy-based pruning method for CNN compression[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1706.05791.pdf.
20 DONG J , ZHENG H , LIAN L . Activation-based weight significance criterion for pruning deep neural networks[C]// Proceedings of the 2017 International Conference on Image and Graphics, LNCS 10667. Cham: Springer, 2017: 62-73.
21 HE Y , LIU P , WANG Z , et al . Pruning filter via geometric median for deep convolutional neural networks acceleration[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1811.00250.pdf.
22 靳丽蕾,杨文柱,王思乐,等 . 一种用于卷积神经网络压缩的混合剪枝方法[J]. 小型微型计算机系统, 2018, 39(12): 2596-2601. (JIN L L, YANG W Z, WANG S L, et al. Mixed pruning method for convolutional neural network compression[J]. Journal of Chinese Computer Systems, 2018, 39(12): 2596-2601.)
23 靳丽蕾 . 基于剪枝的卷积神经网络压缩方法研究[D]. 保定:河北大学, 2019:10-63.(JIN L L. Research on convolution neural network compression method based on pruning[D]. Baoding: Hebei University, 2019:10-63.)
24 WANG Z , ZHU C , XIA Z , et al . Towards thinner convolutional neural networks through gradually global pruning[C]// Proceedings of the 2017 IEEE International Conference on Image Processing. Piscataway: IEEE, 2017: 3939-3943.
25 LEE N , AJANTHAN T , TORR P H S . SNIP: single-shot network pruning based on connection sensitivity[EB/OL]. [2019-06-30]. http://www.robots.ox.ac.uk/~tvg/publications/2019/SNIP-ICLR-camera-ready.pdf.
26 GUO Y , YAO A , CHEN Y . Dynamic network surgery for efficient DNNs[C]// Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: Curran Associates Inc., 2016: 1387-1395.
27 SRINIVAS S , BABU R V . Data-free parameter pruning for deep neural networks[C]// Proceedings of the 2015 British Machine Vision Conference. Durham: BMVA, 2015: No.31.
28 HU H , PENG R , TAI Y . Network trimming: a data-driven neuron pruning approach towards efficient deep architectures[EB/OL]. [2019-06-30]. https://arxiv.org/pdf/1607.03250.pdf.
29 LEE W , XIANG D . Information-theoretic measures for anomaly detection[C]// Proceedings of the 2001 IEEE Symposium on Security and Privacy. Piscataway: IEEE, 2001: 130-143.
30 LIU Y , SCHMIDT B . LightSpMV: faster CSR-based sparse matrix-vector multiplication on CUDA-enabled GPUs[C]// Proceedings of the 2015 IEEE 26th International Conference on Application-Specific Systems, Architectures and Processors. Piscataway: IEEE, 2015: 82-89.
31 KRIZHEVSKY A . Learning multiple layers of features from tiny images[D]. Toronto, ON: University of Toronto, 2009: 3-60.