Journal of Computer Applications ›› 2020, Vol. 40 ›› Issue (8): 2202-2206.DOI: 10.11772/j.issn.1001-9081.2019122154

• Artificial intelligence •

Aspect-based sentiment analysis with self-attention gated graph convolutional network

CHEN Jiawei, HAN Fang, WANG Zhijie   

  1. College of Information Science and Technology, Donghua University, Shanghai 201620, China
  • Received:2019-12-24 Revised:2020-02-27 Online:2020-08-10 Published:2020-05-13
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (11972115, 11572084).

  • Corresponding author: HAN Fang (1981-), female, born in Shuozhou, Shanxi; professor, Ph.D. Her research interests include intelligent systems and neurodynamics. E-mail: yadiahan@dhu.edu.cn
  • About the authors: CHEN Jiawei (1994-), male, born in Nantong, Jiangsu, is a master's student whose research interests include natural language processing and machine learning. WANG Zhijie (1969-), male, born in Taizhou, Zhejiang, is a professor and Ph.D. whose research interests include neural networks, digital signal processing, and industrial control software.

Abstract: Aspect-based sentiment analysis aims to predict the sentiment polarities expressed toward different aspects of a sentence. To address the problems that existing models combining Recurrent Neural Networks (RNN) with attention mechanisms require many training parameters and fail to account for syntactic constraints and long-distance word dependencies, a self-attention gated graph convolutional network, MSAGCN, was proposed. First, a multi-head self-attention mechanism was used to encode the context words and the target, capturing the semantic associations within the sentence. Then, a Graph Convolutional Network (GCN) was built over the sentence's dependency tree to obtain syntactic information and word dependencies. Finally, the sentiment toward the specific target was obtained through a Gated Tanh-ReLU Unit (GTRU). Compared with baseline models, the proposed model improves accuracy and F1 score by 1%-3.3% and 1.4%-6.3% respectively. In addition, the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model was applied to the task, further improving performance. Experimental results verify that the proposed model better captures the sentiment tendencies of user reviews.
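The two model components named in the abstract can be illustrated in code. The following PyTorch sketch is not the authors' implementation: it shows one common formulation of a degree-normalized GCN layer over a dependency-tree adjacency matrix, and a GTRU-style gate in which a tanh feature path is modulated by an aspect-conditioned ReLU gate; all layer names, sizes, and the normalization choice are illustrative assumptions.

```python
import torch
import torch.nn as nn


class GraphConvLayer(nn.Module):
    """One GCN layer over a sentence's dependency-tree adjacency matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq, dim) word representations
        # adj: (batch, seq, seq) adjacency from the dependency parse (with self-loops)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)  # node degrees for normalization
        return torch.relu(self.linear(adj @ h) / deg)


class GTRU(nn.Module):
    """Gated Tanh-ReLU Unit: a tanh feature path multiplied elementwise by a
    ReLU gate that conditions on the target (aspect) representation."""

    def __init__(self, dim: int):
        super().__init__()
        self.feat = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        self.aspect = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, a: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq, dim) context states; a: (batch, dim) aspect vector
        s = torch.tanh(self.feat(h))                                 # sentiment features
        g = torch.relu(self.gate(h) + self.aspect(a).unsqueeze(1))   # aspect-aware gate
        return s * g
```

In this sketch the GCN layer would be applied to the self-attention-encoded states, and the GTRU output pooled and fed to a classifier; stacking depth and pooling strategy are design choices not specified by the abstract.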

Key words: aspect-based sentiment analysis, self-attention mechanism, Graph Convolutional Network (GCN), gating mechanism, Bidirectional Encoder Representations from Transformers (BERT)

