[1] SEE A,LIU P J,MANNING C D. Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 2017 55th Annual Meeting of the Association for Computational Linguistics. Stroudsburg:ACL,2017:1073-1083.
[2] VASWANI A,SHAZEER N,PARMAR N,et al. Attention is all you need[C]//Proceedings of the 2017 31st International Conference on Neural Information Processing Systems. Red Hook:Curran Associates Inc.,2017:6000-6010.
[3] LEE H,CHOI Y,LEE J H. Attention history-based attention for abstractive text summarization[C]//Proceedings of the 2020 35th Annual ACM Symposium on Applied Computing. New York:ACM,2020:1075-1081.
[4] TU Z,LU Z,LIU Y,et al. Modeling coverage for neural machine translation[C]//Proceedings of the 2016 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg:ACL,2016:76-85.
[5] GONG Y F,LIU H Y,HE J,et al. Research on text summarization model with coverage mechanism[J]. Journal of Frontiers of Computer Science and Technology,2019,13(2):205-213.
[6] GU J,LU Z,LI H,et al. Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 2016 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg:ACL,2016:1631-1640.
[7] VINYALS O,FORTUNATO M,JAITLY N. Pointer networks[C]//Proceedings of the 2015 28th International Conference on Neural Information Processing Systems. Cambridge:MIT Press,2015:2692-2700.
[8] GULCEHRE C,AHN S,NALLAPATI R,et al. Pointing the unknown words[C]//Proceedings of the 2016 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg:ACL,2016:140-149.
[9] MERITY S,XIONG C,BRADBURY J,et al. Pointer sentinel mixture models[EB/OL].[2020-09-20]. https://arxiv.org/pdf/1609.07843.pdf.
[10] MI H,SANKARAN B,WANG Z,et al. Coverage embedding models for neural machine translation[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Stroudsburg:ACL,2016:955-960.
[11] SANKARAN B,MI H,AL-ONAIZAN Y,et al. Temporal attention model for neural machine translation[EB/OL].[2020-09-20]. https://arxiv.org/pdf/1608.02927.pdf.
[12] SUZUKI J,NAGATA M. RNN-based encoder-decoder approach with word frequency estimation[EB/OL].[2020-09-20]. https://arxiv.org/pdf/1701.00138v1.pdf.
[13] KOEHN P. Pharaoh:a beam search decoder for phrase-based statistical machine translation models[C]//Proceedings of the 2004 6th Conference of the Association for Machine Translation in the Americas,LNCS 3265. Berlin:Springer,2004:115-124.
[14] LIN C Y,HOVY E. Automatic evaluation of summaries using N-gram co-occurrence statistics[C]//Proceedings of the 2003 Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics. Stroudsburg:ACL,2003:71-78.
[15] HU B,CHEN Q,ZHU F. LCSTS:a large scale Chinese short text summarization dataset[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg:ACL,2015:1967-1972.