Journal of Computer Applications ›› 2023, Vol. 43 ›› Issue (9): 2753-2759. DOI: 10.11772/j.issn.1001-9081.2022091347

• Artificial intelligence •

Aspect-based sentiment analysis method integrating prompt knowledge

Xinyue ZHANG, Rong LIU, Chiyu WEI, Ke FANG

  1. College of Physical Science and Technology, Central China Normal University, Wuhan, Hubei 430079, China
  • Received: 2022-09-09 Revised: 2022-11-11 Accepted: 2022-11-15 Online: 2023-02-14 Published: 2023-09-10
  • Contact: Rong LIU
  • About author: ZHANG Xinyue, born in 1997 in Zhoukou, Henan, M.S. candidate. Her research interests include pattern recognition and aspect-based sentiment analysis.
    WEI Chiyu, born in 1998 in Zhoukou, Henan, M.S. candidate. His research interests include deep learning and object detection.
    FANG Ke, born in 1999 in Zhoukou, Henan, M.S. candidate. His research interests include deep learning and object detection.
  • Supported by:
    Key Project of National Social Science Foundation of China (22ATQ004); Cross Science Research Project of Central China Normal University (CCNU22JC033)

Abstract:

Aspect-based sentiment analysis based on pre-trained models generally adopts an end-to-end framework, which suffers from inconsistency between the upstream and downstream tasks and has difficulty in effectively modeling the relationship between aspect words and their context. To address these problems, an aspect-based sentiment analysis method integrating prompt knowledge was proposed. First, in order to capture the semantic relation between aspect words and context effectively and enhance the model's perception of the sentiment analysis task, a prompt text was constructed on the basis of the Prompt mechanism and spliced with the original sentence and the aspect words, and the result was used as the input of the pre-trained model Bidirectional Encoder Representations from Transformers (BERT). Then, a sentiment label vocabulary was built and integrated into the sentimental verbalizer layer, so as to reduce the search space of the model, enable the pre-trained model to acquire the rich semantic knowledge in the label vocabulary, and improve the learning ability of the model. Experimental results on the Restaurant and Laptop domain datasets of SemEval2014 Task 4 as well as the ChnSentiCorp dataset show that the F1-scores of the proposed method reach 77.42%, 75.20% and 94.89% respectively, improvements of 0.65 to 10.71, 1.02 to 9.58 and 0.83 to 6.40 percentage points over mainstream aspect-based sentiment analysis methods such as Glove-TextCNN and P-tuning, which verifies the effectiveness of the proposed method.
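As a rough illustration of the formulation described above (a minimal sketch, not the authors' released implementation), the snippet below splices the original sentence, the aspect word and a prompt text containing a masked slot, feeds the result to a BERT masked-language model, and maps a small label vocabulary to sentiment polarities through a verbalizer. The prompt wording, the label words "great"/"bad"/"fine" and the "bert-base-uncased" checkpoint are illustrative assumptions; the paper's method additionally fine-tunes the model on labeled data, whereas this sketch only shows the zero-shot prediction path.

```python
# Illustrative sketch of prompt-based aspect sentiment classification with a
# verbalizer; template, label words and checkpoint are assumptions, not the
# paper's exact choices.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# Verbalizer: a small label vocabulary mapped to polarities, which restricts
# the search space at the masked position.
verbalizer = {"great": "positive", "bad": "negative", "fine": "neutral"}

def predict_polarity(sentence: str, aspect: str) -> str:
    # Splice the original sentence, the aspect word and the prompt text;
    # the [MASK] slot is where BERT's MLM head predicts a label word.
    prompt = f"{sentence} The sentiment polarity of {aspect} is {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits                      # (1, seq_len, vocab_size)
    mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    scores = logits[0, mask_pos, :].squeeze(0)               # scores over the vocabulary
    # Keep only the label words and map the highest-scoring one to a polarity.
    best_word = max(verbalizer,
                    key=lambda w: scores[tokenizer.convert_tokens_to_ids(w)].item())
    return verbalizer[best_word]

print(predict_polarity("The food was tasty but the service was slow.", "service"))
```

In this reading, the verbalizer plays the role of the sentimental verbalizer layer in the abstract: instead of classifying over a new output head, the pre-trained MLM head only has to compare a handful of label words at the masked position, which keeps the downstream task in the same form as pre-training.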

Key words: Natural Language Processing (NLP), aspect-based sentiment analysis, pre-trained model, prompt text, sentimental verbalizer

CLC Number: