Journal of Computer Applications
HUANG Huajuan, FANG Yongkang, WEI Xiuxi
Abstract: Heterogeneous Graph Neural Networks (HGNNs) have made significant progress in a variety of real-world applications, particularly in handling complex graph structures with multiple node and edge types. However, existing HGNN methods still face challenges: they rely heavily on manually designed meta-paths and struggle to model multi-relational semantics flexibly. To address these issues, we propose a Multi-Head Relation-Transformed Heterogeneous Graph Neural Network (MHRTHGN). First, an independent relation transformation matrix and multi-head attention mechanism are constructed for each relation, so that multi-relational information is captured precisely. Then, a relation-aware attention mechanism is introduced to fuse features at both the node level and the relation level, improving the discriminability of node representations. Finally, a feedforward network further strengthens the model's semantic expressiveness. Comparative experiments on the ACM, IMDB, and DBLP datasets show that MHRTHGN improves accuracy on node classification and clustering tasks by 3%–5% on average over representative models such as HAN and HGT, validating the effectiveness and superiority of the proposed model.
Key words: heterogeneous graph, graph neural network, graph representation learning, multi-head attention mechanism, relation-aware
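The layer structure sketched in the abstract — a relation-specific transformation matrix with multi-head attention per relation, followed by a relation-level attention that fuses the per-relation messages — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function name `mhrt_layer`, the random parameter initialization, and the exact attention scoring (a GAT-style concatenation score at the node level, a tanh-projected query at the relation level) are assumptions made for illustration only; the paper's feedforward network is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def mhrt_layer(h, edges_by_rel, d_out=8, n_heads=2):
    """Hypothetical sketch of one MHRTHGN-style layer.

    h            : (N, d_in) node feature matrix.
    edges_by_rel : {relation name: list of (src, dst) index pairs}.

    Each relation gets its own transformation matrix and attention heads
    (node-level attention); a relation-level attention then fuses the
    per-relation messages into one representation per node.
    """
    n, d_in = h.shape
    d_head = d_out // n_heads
    rel_msgs = []                                   # one (N, d_out) message per relation
    for rel, edges in edges_by_rel.items():
        # Relation-specific transformation: one projection per head.
        W = rng.standard_normal((n_heads, d_in, d_head)) * 0.1
        a = rng.standard_normal((n_heads, 2 * d_head))   # per-head attention vector
        z = np.einsum('nd,hdk->hnk', h, W)               # (H, N, d_head)
        out = np.zeros((n, d_out))
        for dst in range(n):
            srcs = [s for s, d in edges if d == dst]
            if not srcs:
                continue                                  # no neighbors under this relation
            heads = []
            for hd in range(n_heads):
                # Node-level attention: GAT-style score on [dst ‖ src] pairs.
                scores = np.array([a[hd] @ np.concatenate([z[hd, dst], z[hd, s]])
                                   for s in srcs])
                alpha = softmax(scores)
                heads.append(alpha @ z[hd, srcs])         # weighted neighbor sum
            out[dst] = np.concatenate(heads)              # concatenate heads -> (d_out,)
        rel_msgs.append(out)
    M = np.stack(rel_msgs)                                # (R, N, d_out)
    # Relation-level attention: weight each relation's message per node.
    q = rng.standard_normal(d_out) * 0.1
    beta = softmax(np.tanh(M) @ q, axis=0)                # (R, N)
    return (beta[..., None] * M).sum(axis=0)              # fused (N, d_out)
```

On a toy academic graph with two relations, the layer returns one fused vector per node; in the full model this output would pass through the feedforward network described above.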
CLC Number: TP18
HUANG Huajuan, FANG Yongkang, WEI Xiuxi. Multi-head relation-transformed heterogeneous graph neural network [J]. Journal of Computer Applications, DOI: 10.11772/j.issn.1001-9081.2025081065.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2025081065