[1] 黄恒琪, 于娟, 廖晓, 等. 知识图谱研究综述[J]. 计算机系统应用, 2019, 28(6): 1-12. (HUANG H Q, YU J, LIAO X, et al. Review on knowledge graphs[J]. Computer Systems & Applications, 2019, 28(6): 1-12.)
[2] KAMBHATLA N. Combining lexical, syntactic, and semantic features with maximum entropy models for extracting relations[C]//Proceedings of the ACL 2004 Interactive Poster and Demonstration Sessions. Stroudsburg: ACL, 2004: 22.
[3] BUNESCU R C, MOONEY R J. A shortest path dependency kernel for relation extraction[C]//Proceedings of the 2005 Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2005: 724-731.
[4] ZENG D, LIU K, LAI S, et al. Relation classification via convolutional deep neural network[C]//Proceedings of the 25th International Conference on Computational Linguistics: Technical Papers. Dublin, Ireland: Dublin City University, 2014: 2335-2344.
[5] ZHANG D, WANG D. Relation classification via recurrent neural network[EB/OL]. (2015-11-25)[2020-06-16]. https://arxiv.org/pdf/1508.01006.pdf.
[6] SANTOS C N D, XIANG B, ZHOU B. Classifying relations by ranking with convolutional neural networks[EB/OL]. (2015-05-24)[2020-05-14]. http://www.arxiv.org/pdf/1504.06580.pdf.
[7] WANG L, CAO Z, MELO G, et al. Relation classification via multi-level attention CNNs[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2016: 1298-1307.
[8] ZHANG X, CHEN F, HUANG R. A combination of RNN and CNN for attention-based relation classification[J]. Procedia Computer Science, 2018, 131: 911-917.
[9] TANG L, TENG F, MA Z, et al. Convolutional LSTM network with hierarchical attention for relation classification in clinical texts[C]//Proceedings of the 2019 International Joint Conference on Neural Networks. Piscataway: IEEE, 2019: 1-8.
[10] XU Y, MOU L, LI G, et al. Classifying relations via long short term memory networks along shortest dependency paths[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2015: 1785-1794.
[11] GUO X, ZHANG H, YANG H, et al. A single attention-based combination of CNN and RNN for relation classification[J]. IEEE Access, 2019, 7: 12467-12475.
[12] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2016: 1105-1116.
[13] ZHANG Y, QI P, MANNING C D. Graph convolution over pruned dependency trees improves relation extraction[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2018: 2205-2215.
[14] CHENG J P, DONG L, LAPATA M. Long short-term memory-networks for machine reading[C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2016: 551-561.
[15] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. (2017-02-22)[2020-05-07]. https://arxiv.org/pdf/1609.02907.pdf.
[16] XU K, LI C, TIAN Y, et al. Representation learning on graphs with jumping knowledge networks[EB/OL]. (2018-06-25)[2020-05-22]. https://arxiv.org/pdf/1806.03536.pdf.
[17] 钱雪忠, 王晓霞. 基于注意力循环门控图卷积网络的关系提取方法及系统: 中国, 202010850462[P]. 2020-11-24. (QIAN X Z, WANG X X. Relation extraction method and system based on attention recurrent gated graph convolutional network: China, 202010850462[P]. 2020-11-24.)
[18] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 5998-6008.
[19] MARCHEGGIANI D, TITOV I. Encoding sentences with graph convolutional networks for semantic role labeling[C]//Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Stroudsburg: ACL, 2017: 1506-1515.
[20] ZHOU P, SHI W, TIAN J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg: ACL, 2016: 207-212.
[21] LEE J, SEO S, CHOI Y S. Semantic relation classification via bidirectional LSTM networks with entity-aware attention using latent entity typing[J]. Symmetry, 2019, 11(6): 785.
[22] PENG D, ZHANG D, LIU C, et al. BG-SAC: entity relationship classification model based on self-attention supported capsule networks[J]. Applied Soft Computing, 2020, 91: 106186.