Journal of Computer Applications ›› 2021, Vol. 41 ›› Issue (9): 2489-2495.DOI: 10.11772/j.issn.1001-9081.2020111863

Special Issue: Artificial Intelligence

• Artificial intelligence •

Syntax-enhanced semantic parsing with syntax-aware representation

XIE Defeng, JI Jianmin   

  1. School of Computer Science and Technology, University of Science and Technology of China, Hefei Anhui 230027, China
  • Received: 2020-11-27  Revised: 2021-01-15  Online: 2021-09-10  Published: 2021-05-12
  • Supported by:
    This work is partially supported by the Science and Technology Innovation 2030 Major Program "New Generation Artificial Intelligence" (2018AAA0100500) and the Guangdong Province Science and Technology Program (2017B010110011).

  • Corresponding author: JI Jianmin
  • About the authors: XIE Defeng, born in 1994 in Shantou, Guangdong, is a master's student; his research interests include semantic parsing, machine reading comprehension, and named entity recognition. JI Jianmin, born in 1984 in Longxi, Gansu, Ph. D., is an associate professor and CCF member; his research interests include natural language processing, deep reinforcement learning, and cognitive robots.

Abstract: Syntactic information, namely the syntactic structure or dependency relations between the words of a complete sentence, is an important and effective reference in Natural Language Processing (NLP). The task of semantic parsing is to transform natural language sentences directly into semantically complete, computer-executable languages. In previous semantic parsing studies, there have been few efforts to improve end-to-end semantic parsing by exploiting the syntactic information of the input source. To further improve the accuracy and efficiency of end-to-end semantic parsing models, a semantic parsing method was proposed that utilizes the source-side syntactic dependency information. The basic idea of the method is as follows: first, an end-to-end dependency parser was pre-trained; then, the intermediate representation of this parser was used as a syntax-aware representation and concatenated with the original word embedding to produce a new input embedding, which was fed into the end-to-end semantic parsing model; finally, model fusion was carried out through transductive fusion learning. In the experiments, the proposed model was compared with the baseline Transformer model and with related works of the past decade. Experimental results show that, on the ATIS, GEO and JOBS datasets, the semantic parsing model that integrates the dependency syntax-aware representation and transductive fusion learning achieves the best accuracies of 89.1%, 90.7% and 91.4% respectively, exceeding the performance of Transformer and verifying the effectiveness of introducing syntactic dependency information.
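To make the concatenation step concrete, the sketch below shows one way a frozen, pre-trained dependency-parser encoder's intermediate states could be spliced with word embeddings before being projected into a Transformer-based semantic parser. This is a minimal illustration under assumptions: the module names (SyntaxAwareEmbedding, ToyDependencyEncoder), the toy BiLSTM encoder, all dimensions, and the choice to freeze the parser are illustrative and are not the authors' implementation.

    # Minimal sketch (PyTorch); names and dimensions are assumptions, not the paper's code.
    import torch
    import torch.nn as nn

    class SyntaxAwareEmbedding(nn.Module):
        """Concatenates ordinary word embeddings with syntax-aware representations
        taken from a pre-trained (and here frozen) dependency-parser encoder."""
        def __init__(self, vocab_size, word_dim, dep_encoder, dep_dim, model_dim):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, word_dim)
            self.dep_encoder = dep_encoder            # pre-trained dependency parser encoder
            for p in self.dep_encoder.parameters():   # keep the parser fixed during training
                p.requires_grad = False
            self.proj = nn.Linear(word_dim + dep_dim, model_dim)

        def forward(self, token_ids):
            word_vecs = self.word_emb(token_ids)                 # (B, T, word_dim)
            with torch.no_grad():
                syntax_vecs = self.dep_encoder(token_ids)        # (B, T, dep_dim)
            fused = torch.cat([word_vecs, syntax_vecs], dim=-1)  # splice the two views
            return self.proj(fused)                              # (B, T, model_dim)

    class ToyDependencyEncoder(nn.Module):
        """Hypothetical stand-in for the parser's intermediate representation."""
        def __init__(self, vocab_size, dep_dim):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, dep_dim)
            self.rnn = nn.LSTM(dep_dim, dep_dim // 2, bidirectional=True, batch_first=True)

        def forward(self, token_ids):
            out, _ = self.rnn(self.emb(token_ids))
            return out

    # Usage: the fused embeddings feed a standard Transformer semantic parser.
    dep_enc = ToyDependencyEncoder(vocab_size=1000, dep_dim=128)
    embedder = SyntaxAwareEmbedding(vocab_size=1000, word_dim=256,
                                    dep_encoder=dep_enc, dep_dim=128, model_dim=512)
    tokens = torch.randint(0, 1000, (2, 10))   # a batch of 2 sentences, 10 tokens each
    inputs = embedder(tokens)                  # (2, 10, 512), ready for an nn.Transformer

In this sketch the syntax-aware representation enters only through the input embedding, so any encoder-decoder semantic parser can consume it unchanged; the transductive fusion learning step described in the abstract is not depicted.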

Key words: syntax-aware, semantic parsing, deep learning, Natural Language Processing (NLP), language model

CLC Number: