Journal of Computer Applications ›› 2018, Vol. 38 ›› Issue (7): 1831-1838.DOI: 10.11772/j.issn.1001-9081.2017123009


Semantic relation extraction model via attention-based neural Turing machine

ZHANG Runyan1, MENG Fanrong1, ZHOU Yong1, LIU Bing1,2   

  1. School of Computer Science and Technology, China University of Mining and Technology, Xuzhou Jiangsu 221116, China;
    2. Institute of Electrics, Chinese Academy of Sciences, Beijing 100080, China
  • Received: 2017-12-22 Revised: 2018-02-09 Online: 2018-07-10 Published: 2018-07-12
  • Supported by:
    This work is partially supported by the General Program of the National Natural Science Foundation of China (61572505).


  • Corresponding author: MENG Fanrong
  • About the authors: ZHANG Runyan (born 1994), male, from Beijing, M.S. candidate; research interests: neural networks, natural language processing. MENG Fanrong (born 1962), female, from Shenyang, Liaoning, Ph.D., professor, Ph.D. supervisor; research interests: intelligent information processing, database technology, data mining. ZHOU Yong (born 1974), male, from Xuzhou, Jiangsu, Ph.D., professor, Ph.D. supervisor; research interests: data mining, wireless sensor networks. LIU Bing (born 1981), male, from Yongcheng, Henan, Ph.D., associate professor; research interests: machine learning, pattern recognition.

Abstract: Focusing on the problems of poor memory for long sentences and the weak influence of core words in semantic relation extraction (semantic relation classification), an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, a Neural Turing Machine (NTM) was adopted instead of a Recurrent Neural Network (RNN), with a Long Short-Term Memory (LSTM) network acting as its controller; the NTM's larger, interference-free storage enables it to hold longer memories than an RNN. Second, an attention layer was constructed to organize context information at the word level, so that the model could emphasize the core words in a sentence. Finally, the relation labels were obtained through a classifier. Experiments on the SemEval-2010 Task 8 dataset show that the proposed model outperforms most state-of-the-art methods, achieving an F1-score of 86.2%.
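The word-level attention layer described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name `word_attention`, the query vector `w`, and the `tanh` scoring form are assumptions based on the standard formulation in which attention weights are a softmax over alignment scores of the encoder's per-word outputs, and the sentence representation is their weighted sum.

```python
import numpy as np

def word_attention(H, w):
    """Word-level attention over bidirectional encoder outputs.

    H: (seq_len, hidden) matrix, one row per word (e.g. concatenated
       forward/backward NTM or LSTM states).
    w: (hidden,) trainable attention query vector (hypothetical name).
    Returns (sentence_vector, attention_weights).
    """
    scores = np.tanh(H) @ w              # (seq_len,) alignment scores
    scores = scores - scores.max()       # stabilize softmax numerically
    alpha = np.exp(scores)
    alpha = alpha / alpha.sum()          # softmax attention weights
    return alpha @ H, alpha              # weighted sum of word states

# Illustrative usage with random "encoder outputs"
rng = np.random.default_rng(0)
H = rng.standard_normal((7, 4))          # 7 words, hidden size 4
w = rng.standard_normal(4)
sentence_vec, alpha = word_attention(H, w)
```

In this scheme, words whose hidden states align with the query vector receive larger weights, which is how the model is able to emphasize core words before the classifier sees the sentence vector.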

Key words: Natural Language Processing (NLP), semantic relation extraction, Recurrent Neural Network (RNN), bidirectional Neural Turing Machine (NTM), attention mechanism

