Journal of Computer Applications ›› 2019, Vol. 39 ›› Issue (8): 2198-2203. DOI: 10.11772/j.issn.1001-9081.2018122565

• Artificial intelligence •

Aspect-level sentiment classification model with location weight and long short-term memory based on attention-over-attention

WU Ting, CAO Chunping   

  1. School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200082, China
  • Received: 2019-01-02  Revised: 2019-04-02  Online: 2019-04-17  Published: 2019-08-10
  • Supported by:
    This work is partially supported by the National Natural Science Foundation of China (61803264).

  • Corresponding author: WU Ting
  • About the authors: WU Ting (1993-), female, born in Taiyuan, Shanxi, M. S. candidate, whose research interests include data mining and natural language processing; CAO Chunping (1968-), female, born in Lanzhou, Gansu, M. S., associate professor, CCF member, whose research interests include intelligent data processing and personalized services.

Abstract: Traditional attention-based neural network models cannot attend effectively to aspect features and sentiment information, and context words at different distances from, or in different directions relative to, an aspect word contribute differently to the judgment of its sentiment polarity. To address these problems, a Location Weight and Attention-Over-Attention Long Short-Term Memory (LWAOA-LSTM) model was proposed. Firstly, location weight information was added to the word vectors. Then, a Long Short-Term Memory (LSTM) network was used to model the aspect and the sentence simultaneously, generating an aspect representation and a sentence representation; the two representations were learned jointly through an attention-over-attention module, which captured the interactions from the aspect to the text and from the text to the aspect and automatically attended to the important parts of the sentence. Finally, experiments were carried out on topic-specific datasets about attractions, catering and accommodation to verify the accuracy of the model for aspect-level sentiment analysis. Experimental results show that the accuracy of the model on the attractions, catering and accommodation datasets reaches 78.3%, 80.6% and 82.1% respectively, and that LWAOA-LSTM performs better than traditional LSTM network models.
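The following minimal PyTorch-style sketch illustrates the pipeline the abstract describes: location-weighted word vectors, LSTM encoders for the sentence and the aspect, and an attention-over-attention interaction between the two. It is only an illustration under stated assumptions, not the authors' released code: the class name LWAOALSTM, the 1 - |distance|/length weighting formula, the use of bidirectional encoders, the three-class output and all hyperparameters are hypothetical choices.

# Minimal sketch of the LWAOA-LSTM idea described in the abstract.
# The names, the position-weight formula and all hyperparameters are
# illustrative assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LWAOALSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Separate BiLSTMs encode the sentence and the aspect phrase.
        self.sent_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.asp_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    @staticmethod
    def location_weights(sent_len, asp_pos):
        # One simple weighting choice: 1 - |distance to aspect| / sentence length,
        # so context words closer to the aspect contribute more.
        idx = torch.arange(sent_len, dtype=torch.float)
        w = 1.0 - (idx - float(asp_pos)).abs() / sent_len
        return w.clamp(min=0.0)  # shape: (sent_len,)

    def forward(self, sent_ids, asp_ids, asp_pos):
        # sent_ids: (B, n), asp_ids: (B, m), asp_pos: (B,) index of the aspect word
        B, n = sent_ids.shape
        w = torch.stack([self.location_weights(n, p.item()) for p in asp_pos])
        w = w.to(sent_ids.device)                           # (B, n)
        sent_emb = self.embed(sent_ids) * w.unsqueeze(-1)   # fuse location weight into word vectors
        asp_emb = self.embed(asp_ids)

        H_s, _ = self.sent_lstm(sent_emb)   # (B, n, 2h) sentence representation
        H_a, _ = self.asp_lstm(asp_emb)     # (B, m, 2h) aspect representation

        # Attention-over-attention: pairwise interaction matrix between sentence and aspect.
        I = torch.bmm(H_s, H_a.transpose(1, 2))   # (B, n, m)
        alpha = F.softmax(I, dim=1)                # attention over sentence words per aspect word
        beta = F.softmax(I, dim=2)                 # attention over aspect words per sentence word
        beta_avg = beta.mean(dim=1, keepdim=True)  # (B, 1, m) averaged aspect-side attention
        gamma = torch.bmm(alpha, beta_avg.transpose(1, 2))      # (B, n, 1) final sentence attention
        r = torch.bmm(H_s.transpose(1, 2), gamma).squeeze(-1)   # (B, 2h) attended sentence vector
        return self.fc(r)                                       # sentiment-polarity logits

Training such a model with a standard cross-entropy loss over (sentence, aspect, polarity) examples would be one plausible way to reproduce the kind of aspect-level classification setup the abstract evaluates.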

Key words: deep learning, aspect-level sentiment classification, location-weighted word vector, attention-over-attention, Long Short-Term Memory (LSTM) network

