Journal of Computer Applications

Multi-scale Decorrelation Graph Convolutional Network

  • Received:2024-07-08 Revised:2024-10-14 Online:2024-11-19 Published:2024-11-19
  • Supported by:
    the National Natural Science Foundation of China

CHEN Danyang1, ZHANG Changlun2

  1. Beijing University of Civil Engineering and Architecture
    2. School of Science, Beijing University of Civil Engineering and Architecture
  • Corresponding author: CHEN Danyang

Abstract: To address the performance degradation of deep graph neural networks and the excessive correlation of their feature matrices, a Multi-scale Decorrelation graph convolutional network (Multi-Deprop) was proposed. Firstly, in the feature propagation operation, multi-scale decorrelation parameters were used so that the network maintained strong decorrelation in low layers and weak decorrelation in high layers during propagation, thereby adapting to the feature-processing needs of different layers. Secondly, in the feature transformation operation, an orthogonal regularization loss and a maximum informatization loss were introduced to help alleviate the over-correlation problem: the orthogonal regularization loss preserved the independence of features, while the maximum informatization loss maximized the mutual information between the inputs and the representations to reduce redundancy in the feature information. Finally, in comparison experiments with four baseline models on seven node classification datasets, Multi-Deprop achieved better node classification accuracy at most depths from 2 to 32 layers, especially at high layer counts, alleviating the performance degradation of deep networks. In the correlation analysis of the feature matrices, the correlation of Multi-Deprop on the Cora dataset was about 0.4, showing that the over-correlation problem was greatly alleviated. Ablation experiments also showed that each of the two improved modules enhanced model performance to some extent.
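The paper's exact loss formulations are not given in this abstract, but an orthogonal regularization loss of the kind described above is commonly implemented as a squared Frobenius-norm penalty on the deviation of the feature Gram matrix from the identity. The following NumPy sketch illustrates the idea; the function name, the column normalization, and the penalty form are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def orthogonal_regularization(H: np.ndarray) -> float:
    """Penalty ||H^T H - I||_F^2 on a feature matrix H (nodes x dims).

    Columns of H are L2-normalized first, so the Gram matrix holds cosine
    similarities between feature dimensions; fully decorrelated features
    give a zero penalty, redundant features give a large one.
    NOTE: hypothetical sketch -- the paper's formulation may differ.
    """
    H = H / (np.linalg.norm(H, axis=0, keepdims=True) + 1e-12)
    gram = H.T @ H                      # (dims x dims) correlation structure
    eye = np.eye(H.shape[1])
    return float(np.sum((gram - eye) ** 2))  # squared Frobenius norm
```

For example, perfectly decorrelated features (orthogonal columns) yield a penalty of zero, while two identical feature columns yield cosine similarity 1 off the diagonal and hence a positive penalty, which is what drives the representations toward lower correlation during training.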

Key words: deep graph neural network, over-correlation, L2 regularization, maximum informatization, multi-scale decorrelation

