Journal of Computer Applications ›› 2025, Vol. 45 ›› Issue (7): 2180-2187.DOI: 10.11772/j.issn.1001-9081.2024070951

• Artificial intelligence •

Multi-scale decorrelation graph convolutional network model

Danyang CHEN, Changlun ZHANG

  1. School of Science, Beijing University of Civil Engineering and Architecture, Beijing 102616, China
  • Received: 2024-07-05 Revised: 2024-10-14 Accepted: 2024-10-16 Online: 2025-07-10 Published: 2025-07-10
  • Contact: Changlun ZHANG, zclun@bucea.edu.cn
  • About author: CHEN Danyang, born in 1999, M.S. candidate. Her research interests include graph neural networks.
    ZHANG Changlun, born in 1972, Ph.D., associate professor. His research interests include machine learning, deep learning, and big data modeling and analytics.
  • Supported by:
    National Natural Science Foundation of China(62072024)

Abstract:

Deep Graph Neural Networks (GNNs) aim to capture both local and global features in complex networks, thereby alleviating the information propagation bottleneck in graph-structured data. However, current deep GNN models often suffer from feature over-correlation. To address this, a Multi-scale Decorrelation graph convolutional network (Multi-Deprop) model was proposed. The model comprises two operations: feature propagation and feature transformation. In the feature propagation operation, multi-scale decorrelation parameters were introduced to maintain strong decorrelation in the lower network layers and weak decorrelation in the higher layers, adapting to the needs of feature processing at different depths. In the feature transformation operation, an orthogonal regularization loss and a mutual information maximization loss were introduced: orthogonal regularization preserves feature independence, while mutual information maximization maximizes the mutual information between the input and the learned representation, thereby reducing feature redundancy. Finally, the proposed model was compared with four baseline models on seven node classification datasets. Experimental results show that Multi-Deprop achieves better node classification accuracy in most settings with 2 to 32 layers. In particular, on the Cora dataset, Multi-Deprop improves the accuracy of 4- to 32-layer models by 0.80% to 13.28% over the baseline model Deprop, indicating that the proposed model alleviates the performance degradation problem of deep networks to a certain degree. In the correlation analysis of feature matrices, the feature matrix obtained by the deep Multi-Deprop model on the Cora dataset has a correlation of about 0.40, i.e., weak correlation, demonstrating that Multi-Deprop significantly alleviates the over-correlation issue.
The results of ablation studies and loss visualization experiments show that the improvements to both operations contribute to model performance. It can be seen that the Multi-Deprop model significantly reduces feature redundancy in deep networks while maintaining high classification accuracy, and has strong generalization ability and practical value.
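The two regularization ideas described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the orthogonal penalty and the off-diagonal correlation measure follow standard definitions, while the layer-wise decorrelation weight schedule (linear decay from strong to weak) is purely an assumption for illustration.

```python
import numpy as np

def orthogonal_reg_loss(W):
    """Orthogonal regularization: penalize ||W^T W - I||_F^2, pushing the
    columns of a transform matrix toward mutual independence."""
    d = W.shape[1]
    gram = W.T @ W
    return float(np.sum((gram - np.eye(d)) ** 2))

def feature_correlation(H):
    """Mean absolute off-diagonal Pearson correlation between feature
    dimensions of a node-feature matrix H (n_nodes x d) -- the kind of
    over-correlation measure discussed in the abstract."""
    C = np.corrcoef(H, rowvar=False)        # (d, d) correlation matrix
    d = C.shape[0]
    off = np.abs(C) - np.eye(d)             # zero out the diagonal
    return float(off.sum() / (d * (d - 1)))

def decorrelation_weights(num_layers, w_max=1.0, w_min=0.1):
    """Hypothetical multi-scale schedule: strong decorrelation in the lower
    layers, weak in the higher layers (linear decay is an assumption)."""
    return np.linspace(w_max, w_min, num_layers)
```

For example, an exactly orthogonal transform incurs zero penalty (`orthogonal_reg_loss(np.eye(3))` is `0.0`), and `feature_correlation` returns a value near 0 for decorrelated features and near 1 when feature dimensions are redundant, matching the 0.40 "weak correlation" reading reported for Cora.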

Key words: deep Graph Neural Network (GNN), over-correlation, L2 regularization, mutual information maximization, multi-scale decorrelation

