Sentiment classification model of psychological counseling text based on attention over attention mechanism
Yuqing WANG, Guangli ZHU, Wenjie DUAN, Shuyu LI, Ruotong ZHOU
Journal of Computer Applications    2024, 44 (8): 2393-2399.   DOI: 10.11772/j.issn.1001-9081.2023081168

Sentiment classification in psychological counseling scenarios aims to obtain the sentiment polarity of the inquirer's utterances, which can support the construction of psychological counseling Artificial Intelligence (AI) assistants. Existing methods obtain the sentiment polarity of a text from its contextual information alone, failing to consider the sentiment transmission between the current sentence and its preceding sentences in the dialogue record. To address this issue, a sentiment classification model for psychological counseling text based on the Attention Over Attention (AOA) mechanism was proposed, in which historical sentiment words were assigned weights according to their temporal order, improving the accuracy of sentiment classification for psychological counseling text. In a dialogue, the historical sentiment word sequences of both parties were extracted using a constructed mental-health sentiment lexicon. Subsequently, the current sentence and the two historical sentiment word sequences were fed into a Bidirectional Long Short-Term Memory (BiLSTM) network to obtain the corresponding feature vectors. The Ebbinghaus forgetting curve was used to allocate internal weights within the historical sentiment word sequences. Both inertia features and interaction features were captured by the AOA mechanism. These two features, together with the text features, were then input into the classification layer to calculate the probability of each sentiment polarity. Experimental results on the public Emotional First Aid Dataset show that the proposed model improves the F1 value by 1.55% compared with the Capsule network and Directional Graph Convolutional Network (Caps-DGCN) model. Hence, the proposed model can effectively improve the sentiment classification of psychological counseling text.
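The core AOA computation described above can be illustrated with a minimal sketch. This is the generic attention-over-attention formulation (pairwise interaction matrix, column-wise and row-wise softmax, then a weighted combination), not the authors' exact implementation; the array shapes and the `strength` of any upstream weighting are illustrative assumptions.

```python
import numpy as np

def softmax(x: np.ndarray, axis: int) -> np.ndarray:
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(target: np.ndarray, context: np.ndarray) -> np.ndarray:
    """Generic AOA step.

    target:  (n, d) feature vectors of the current sentence (e.g. BiLSTM states)
    context: (m, d) feature vectors of a historical sentiment word sequence
    Returns an (n,) attention distribution over the target positions.
    """
    M = target @ context.T          # (n, m) pairwise interaction matrix
    alpha = softmax(M, axis=0)      # column-wise: attention over target per context word
    beta = softmax(M, axis=1)       # row-wise: attention over context per target word
    beta_avg = beta.mean(axis=0)    # (m,) averaged context-level attention
    return alpha @ beta_avg         # (n,) attention over attention
```

Because each column of `alpha` and each row of `beta` is a probability distribution, the returned vector is itself non-negative and sums to 1, so it can directly weight the target features before the classification layer.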

Complex causal relationship extraction based on prompt enhancement and bi-graph attention network
Jinke DENG, Wenjie DUAN, Shunxiang ZHANG, Yuqing WANG, Shuyu LI, Jiawei LI
Journal of Computer Applications    2024, 44 (10): 3081-3089.   DOI: 10.11772/j.issn.1001-9081.2023101486

A complex causal relationship extraction model based on Prompt Enhancement and Bi-Graph ATtention network (PE-BiGAT) was proposed to address the insufficient external information and the forgetting of transmitted information caused by the high density and long sentence patterns of complex causal sentences. Firstly, the result entities were extracted from the sentence and combined with a prompt learning template to form the prompt information, which was then enhanced through an external knowledge base. Then, the prompt information was input into the BiGAT; the attention layer was combined with syntactic and semantic dependency graphs, and the biaffine attention mechanism was used to alleviate feature overlapping and enhance the model's perception of relational features. Finally, all causal entities in the sentence were predicted iteratively by the classifier, and all causal pairs in the sentence were analyzed through a scoring function. Experimental results on the SemEval-2010 Task 8 and AltLex datasets show that, compared with the RPA-GCN (Relationship Position and Attention-Graph Convolutional Network) model, the proposed model improves the F1 score by 1.65 percentage points, with improvements of 2.16 and 4.77 percentage points on chain causal and multi-causal sentences respectively, confirming that the proposed model has an advantage in dealing with complex causal sentences.
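The biaffine attention scoring mentioned above can be sketched in its standard form: a bilinear term between two entity representations plus a linear term over their concatenation. This is the generic biaffine scorer (as popularized in dependency parsing), assumed here for illustration; the dimensions and parameter names are not taken from the paper.

```python
import numpy as np

def biaffine_score(h_head: np.ndarray, h_dep: np.ndarray,
                   U: np.ndarray, W: np.ndarray, b: float) -> float:
    """Standard biaffine score between two d-dimensional representations:

        s = h_head^T U h_dep + W [h_head; h_dep] + b

    h_head, h_dep: (d,) candidate cause/effect entity representations
    U:             (d, d) bilinear interaction weights
    W:             (2d,)  linear weights over the concatenated pair
    b:             scalar bias
    """
    bilinear = h_head @ U @ h_dep                      # pairwise interaction term
    linear = W @ np.concatenate([h_head, h_dep])       # per-entity contribution
    return float(bilinear + linear + b)
```

Scoring every (cause, effect) candidate pair this way yields a matrix of relation scores, over which causal pairs can be selected, which matches the role of the scoring function in the pipeline described above.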
