Multi-label text classification method based on contrastive learning enhanced dual-attention mechanism
Mingfeng YU, Yongbin QIN, Ruizhang HUANG, Yanping CHEN, Chuan LIN
Journal of Computer Applications    2025, 45 (6): 1732-1740.   DOI: 10.11772/j.issn.1001-9081.2024070909

To address the difficulty that existing attention-based methods have in capturing complex dependencies among texts, a multi-label text classification method based on a contrastive-learning-enhanced dual-attention mechanism was proposed. Firstly, text representations based on self-attention and on label attention were learned separately and then fused into a more comprehensive representation, capturing both the structural features of the text and the semantic associations between the text and the labels. Then, a multi-label contrastive learning objective was introduced to supervise the learning of text representations through label-guided text similarity, thereby capturing complex dependencies among texts at the topic, content, and structural levels. Finally, a feedforward neural network was used as the classifier. Experimental results demonstrate that, compared with LDGN (Label-specific Dual Graph neural Network), the proposed method improves the normalized Discounted Cumulative Gain at top-5 (nDCG@5) by 1.81 and 0.86 percentage points on the EUR-Lex (European Union Law document) dataset and the Reuters-21578 dataset, respectively, and achieves competitive results on the AAPD (Arxiv Academic Paper Dataset) and RCV1 (Reuters Corpus Volume I) datasets. These results show that the method effectively captures complex dependencies among texts at the topic, content, and structural levels, yielding good performance on multi-label text classification tasks.
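The label-guided contrastive objective described in the abstract can be sketched as below. This is a minimal illustration, not the paper's exact formulation: the use of Jaccard overlap between label sets as the pair weight, the temperature value, and all function names are assumptions introduced here for clarity.

```python
import numpy as np

def label_guided_contrastive_loss(z, labels, tau=0.1):
    """Illustrative multi-label contrastive loss (assumed form, not the paper's).

    z      : (n, d) array of text representations.
    labels : list of n label sets; label overlap guides which pairs
             are treated as positives and how strongly.
    tau    : temperature for the softmax over pairwise similarities.
    """
    # Cosine similarities between L2-normalized representations.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = (z @ z.T) / tau

    n = len(z)
    loss = 0.0
    for i in range(n):
        # Positive-pair weights: Jaccard overlap of label sets (0 for self).
        w = np.array([
            len(labels[i] & labels[j]) / len(labels[i] | labels[j])
            if j != i else 0.0
            for j in range(n)
        ])
        if w.sum() == 0.0:
            continue  # no label-sharing partner for this text

        # Log-softmax over all other samples (self excluded from denominator).
        exp_sim = np.exp(sim[i])
        exp_sim[i] = 0.0
        log_prob = sim[i] - np.log(exp_sim.sum())

        # Pull texts with overlapping labels together, weighted by overlap.
        loss += -(w * log_prob).sum() / w.sum()
    return loss / n
```

A pair of texts sharing many labels receives a large weight, so minimizing the loss pushes their representations together; texts with disjoint label sets contribute only through the softmax denominator, which pushes them apart.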
