[1] YU A W, DOHAN D, LUONG M T, et al. QANet:combining local convolution with global self-attention for reading comprehension[EB/OL].[2019-03-23]. https://arxiv.org/pdf/1804.09541.pdf.
[2] XIAO L, WANG N, YANG G. A reading comprehension style question answering model based on attention mechanism[C]//Proceedings of the IEEE 29th International Conference on Application-specific Systems, Architectures and Processors. Piscataway, NJ:IEEE, 2018:1-4.
[3] CHEN K, ZHAO T, YANG M, et al. A neural approach to source dependence based context model for statistical machine translation[J]. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2017, 26(2):266-280.
[4] GAMBHIR M, GUPTA V. Recent automatic text summarization techniques:a survey[J]. Artificial Intelligence Review, 2017, 47(1):1-66.
[5] LOPYREV K. Generating news headlines with recurrent neural networks[EB/OL].[2019-03-15]. https://arxiv.org/pdf/1512.01712.pdf.
[6] RUSH A M, CHOPRA S, WESTON J. A neural attention model for abstractive sentence summarization[EB/OL].[2019-03-13]. https://arxiv.org/pdf/1509.00685.pdf.
[7] HU B, CHEN Q, ZHU F. LCSTS:a large scale Chinese short text summarization dataset[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA:Association for Computational Linguistics, 2015:1967-1972.
[8] ZHANG H, LI J, JI Y, et al. Understanding subtitles by character-level sequence-to-sequence learning[J]. IEEE Transactions on Industrial Informatics, 2017, 13(2):616-624.
[9] SEE A, LIU P J, MANNING C D. Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA:Association for Computational Linguistics, 2017:1073-1083.
[10] CHEN G. Chinese short text summary generation model integrating multi-level semantic information[C]//Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering. Paris:Atlantis Press, 2018:1-12.
[11] 张克君,李伟男,钱榕,等.基于深度学习的文本自动摘要方案[J].计算机应用,2019,39(2):311-315.(ZHANG K J, LI W N, QIAN R, et al. Automatic text summarization scheme based on deep learning[J]. Journal of Computer Applications, 2019, 39(2):311-315.)
[12] 沈华东,彭敦陆.AM-BRNN:一种基于深度学习的文本摘要自动抽取模型[J].小型微型计算机系统,2018,39(6):1184-1189.(SHEN H D, PENG D L. AM-BRNN:automatic text summarization extraction model based on deep learning[J]. Journal of Chinese Computer Systems, 2018, 39(6):1184-1189.)
[13] 李娜娜,刘培玉,刘文锋,等.基于TextRank的自动摘要优化算法[J].计算机应用研究,2019,36(4):1045-1050.(LI N N, LIU P Y, LIU W F, et al. Automatic summarization optimization algorithm based on TextRank[J]. Application Research of Computers, 2019, 36(4):1045-1050.)
[14] 庞超,尹传环.基于分类的中文文本摘要方法[J].计算机科学,2018,45(1):144-147,178.(PANG C, YIN C H. Chinese text summarization based on classification[J]. Computer Science, 2018, 45(1):144-147, 178.)
[15] GEHRING J, AULI M, GRANGIER D, et al. Convolutional sequence to sequence learning[C]//Proceedings of the 34th International Conference on Machine Learning. [S.l.]:PMLR, 2017:1243-1252.
[16] YIN Z, SHEN Y Y. On the dimensionality of word embedding[C]//Proceedings of the 2018 Neural Information Processing Systems Conference. Vancouver:NeurIPS, 2018:887-898.
[17] LIN C Y, HOVY E. Automatic evaluation of summaries using n-gram co-occurrence statistics[C]//Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology. Stroudsburg, PA:Association for Computational Linguistics, 2003:71-78.