[1] 黄波,刘传才. 基于加权TextRank的中文自动文本摘要[J]. 计算机应用研究,2020,37(2):407-410. (HUANG B,LIU C C. Chinese automatic text summarization based on weighted TextRank[J]. Application Research of Computers,2020,37(2):407-410.)
[2] GAMBHIR M,GUPTA V. Recent automatic text summarization techniques:a survey[J]. Artificial Intelligence Review,2017,47(1):1-66.
[3] ERKAN G,RADEV D R. LexRank:graph-based lexical centrality as salience in text summarization[J]. Journal of Artificial Intelligence Research,2004,22(1):457-479.
[4] RUSH A M,CHOPRA S,WESTON J. A neural attention model for abstractive sentence summarization[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2015:379-389.
[5] SHARMA E,HUANG L,HU Z,et al. An entity-driven framework for abstractive summarization[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing/the 9th International Joint Conference on Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2019:3280-3291.
[6] ZHANG J,ZHAO Y,SALEH M,et al. PEGASUS:pre-training with extracted gap-sentences for abstractive summarization[C]//Proceedings of the 37th International Conference on Machine Learning. La Jolla,CA:International Machine Learning Society,2020:11328-11339.
[7] SAITO I,NISHIDA K,NISHIDA K,et al. Abstractive summarization with combination of pre-trained sequence-to-sequence and saliency models[EB/OL].[2020-04-12]. https://arxiv.org/pdf/2003.13028v1.pdf.
[8] LI S,LEI D,QIN P,et al. Deep reinforcement learning with distributional semantic rewards for abstractive summarization[C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing/the 9th International Joint Conference on Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2019:6038-6044.
[9] CHENG H,CAI J,FANG Y. RL-Gen:a character-level text generation framework with reinforcement learning in domain generation algorithm case[C]//Proceedings of the 26th International Conference on Neural Information Processing,CCIS 1143. Cham:Springer,2019:690-697.
[10] ZHANG T,KISHORE V,WU F,et al. BERTScore:evaluating text generation with BERT[EB/OL].[2020-03-07]. https://arxiv.org/pdf/1904.09675v3.pdf.
[11] HERMANN K M,KOČISKÝ T,GREFENSTETTE E,et al. Teaching machines to read and comprehend[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2015:1693-1701.
[12] LIN C Y. ROUGE:a package for automatic evaluation of summaries[C]//Text Summarization Branches Out:Proceedings of the ACL-04 Workshop. Stroudsburg,PA:Association for Computational Linguistics,2004:74-81.
[13] NARAYAN S,COHEN S B,LAPATA M. Ranking sentences for extractive summarization with reinforcement learning[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2018:1747-1759.
[14] CHEN Y,MA Y,MAO X,et al. Abstractive summarization with the aid of extractive summarization[C]//Proceedings of the 2018 Asia-Pacific Web (APWeb) and Web-Age Information Management (WAIM) Joint International Conference on Web and Big Data,LNCS 10987. Cham:Springer,2018:3-15.
[15] KIM Y. Convolutional neural networks for sentence classification[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg,PA:Association for Computational Linguistics,2014:1746-1751.
[16] CHUNG J,GULCEHRE C,CHO K,et al. Empirical evaluation of gated recurrent neural networks on sequence modeling[EB/OL].[2018-09-07]. https://arxiv.org/pdf/1412.3555v1.pdf.
[17] VINYALS O,FORTUNATO M,JAITLY N. Pointer networks[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2015:2692-2700.
[18] BAHDANAU D,CHO K,BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL].[2018-09-20]. https://arxiv.org/pdf/1409.0473v7.pdf.
[19] GU J,LU Z,LI H,et al. Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2016:1631-1640.
[20] WILLIAMS R J. Simple statistical gradient-following algorithms for connectionist reinforcement learning[J]. Machine Learning,1992,8(3/4):229-256.
[21] MIKOLOV T,CHEN K,CORRADO G,et al. Efficient estimation of word representations in vector space[EB/OL].[2018-07-08]. https://arxiv.org/pdf/1301.3781v3.pdf.
[22] KINGMA D P,BA J L. Adam:a method for stochastic optimization[EB/OL].[2018-07-21]. https://arxiv.org/pdf/1412.6980v9.pdf.
[23] LIU Y,TITOV I,LAPATA M. Single document summarization as tree induction[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg,PA:Association for Computational Linguistics,2019:1745-1755.
[24] SEE A,LIU P J,MANNING C D. Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Stroudsburg,PA:Association for Computational Linguistics,2017:1073-1083.
[25] NALLAPATI R,ZHAI F,ZHOU B. SummaRuNNer:a recurrent neural network based sequence model for extractive summarization of documents[C]//Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence. Palo Alto,CA:AAAI Press,2017:3075-3081.
[26] BAE S,KIM T,KIM J,et al. Summary level training of sentence rewriting for abstractive summarization[C]//Proceedings of the 2nd Workshop on New Frontiers in Summarization. Stroudsburg,PA:Association for Computational Linguistics,2019:10-20.
[27] HAN S,LIN X,JOTY S. Resurrecting submodularity in neural abstractive summarization[EB/OL].[2020-10-10]. https://arxiv.org/pdf/1911.03014v3.pdf.