Chinese entity and relation extraction model based on parallel heterogeneous graph and sequential attention mechanism
Dianhui MAO, Xuebo LI, Junling LIU, Denghui ZHANG, Wenjing YAN
Journal of Computer Applications    2024, 44 (7): 2018-2025.   DOI: 10.11772/j.issn.1001-9081.2023071051

In recent years, with the rapid development of deep learning technology, entity and relation extraction has made remarkable progress in many fields. However, due to the complex syntactic structures and semantic relationships of Chinese text, Chinese entity and relation extraction still faces many challenges, among which the problem of overlapping triples is one of the most important. A Hybrid Neural Network Entity and Relation Joint Extraction (HNNERJE) model was proposed in this article to address the overlapping triple problem in Chinese text. The HNNERJE model fused a sequential attention mechanism and a heterogeneous graph attention mechanism in a parallel manner and combined them through a gated fusion strategy, so that it could capture both the word order information and the entity association information of Chinese text and adaptively adjust the outputs of the subject and object markers, effectively solving the overlapping triple problem. Moreover, an adversarial training algorithm was introduced to improve the model's adaptability when processing unseen samples and noise. Finally, the SHapley Additive exPlanations (SHAP) method was adopted to explain and analyze the HNNERJE model, effectively revealing the key features used in extracting entities and relations. The HNNERJE model achieved high performance on the NYT, WebNLG, CMeIE, and DuIE datasets, with F1 scores of 92.17%, 93.42%, 47.40%, and 67.98%, respectively. The experimental results indicate that the HNNERJE model can transform unstructured text data into structured knowledge representations and effectively extract valuable information.
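The abstract describes fusing a sequential attention branch and a heterogeneous graph attention branch through a gated fusion strategy. The following is a minimal sketch, not the authors' implementation, of how such a gate could combine two parallel token representations; the module name, dimensions, and tensor shapes are assumptions for illustration only.

```python
# A minimal sketch (assumed, not the HNNERJE authors' code) of a gated fusion
# of two parallel token representations: one from a sequential attention
# branch and one from a heterogeneous graph attention branch.
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # The gate decides, per token and per dimension, how much of each
        # branch (sequential vs. graph) contributes to the fused output.
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, seq_repr: torch.Tensor, graph_repr: torch.Tensor) -> torch.Tensor:
        # seq_repr, graph_repr: (batch, seq_len, hidden_dim)
        g = torch.sigmoid(self.gate(torch.cat([seq_repr, graph_repr], dim=-1)))
        return g * seq_repr + (1 - g) * graph_repr


# Example: fuse two hypothetical 256-dimensional token representations.
seq_out = torch.randn(2, 16, 256)    # output of a sequential attention branch (assumed shape)
graph_out = torch.randn(2, 16, 256)  # output of a heterogeneous graph attention branch (assumed shape)
fused = GatedFusion(256)(seq_out, graph_out)  # (2, 16, 256)
```

The fused representation would then feed the subject and object markers mentioned in the abstract; how those markers are constructed is specific to the paper and is not reproduced here.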
