Journal of Computer Applications ›› 2025, Vol. 45 ›› Issue (7): 2211-2220. DOI: 10.11772/j.issn.1001-9081.2024070948

• Artificial Intelligence •

Multi-view knowledge-aware and interactive distillation recommendation algorithm

Yuelan ZHANG, Jing SU, Hangyu ZHAO, Baili YANG

  1. College of Artificial Intelligence, Tianjin University of Science and Technology, Tianjin 300457, China
  • Received: 2024-07-08 Revised: 2024-09-19 Accepted: 2024-09-26 Online: 2025-07-10 Published: 2025-07-10
  • Contact: Jing SU, sujing@tust.edu.cn
  • About author: ZHANG Yuelan, born in 2000 in Qianxi, Guizhou, M. S. candidate, CCF member. Her research interests include recommendation algorithms.
    SU Jing, born in 1979 in Beijing, Ph. D., professor, CCF member. Her research interests include intelligent information processing.
    ZHAO Hangyu, born in 1999 in Yuzhou, Henan, M. S. candidate. His research interests include pattern recognition.
    YANG Baili, born in 2000 in Wu'an, Hebei, M. S. candidate. Her research interests include convolutional neural networks.
  • Supported by:
    National Natural Science Foundation of China (62377036); Tianjin Science and Technology Project (22KPXMRC00210)

Abstract:

Currently, collaborative filtering-based Graph Neural Network (GNN) recommendation systems suffer from severe data sparsity and cold-start problems. Many related algorithms alleviate these problems by introducing external knowledge of items as a supplementary extension, but they ignore the severe imbalance in information utilization caused by directly combining sparse collaborative signals with redundant supplementary content, as well as the problem of sharing and transferring information among different data. Therefore, a Multi-view Knowledge-aware and interactive Distillation Recommendation algorithm (MKDRec) was proposed. Firstly, to address the sparsity of collaborative data, the collaborative view formed from the interaction graph was augmented by random edge dropout, and neighborhood contrastive learning was applied to the node representations in this view. Secondly, to address the knowledge redundancy problem, the edges of each relation type in the knowledge view were encoded, and the item knowledge view was reconstructed on the basis of head and tail entities and their connecting relations, so that the information was fully utilized. Finally, an association view with long-range connections was constructed on the basis of the equivalence relations between items and entities. On these three views, graph node representations were learned with different convolutional aggregation methods to extract multiple kinds of user and item information and obtain multiple embedding representations of users and items. In addition, knowledge distillation and fusion were performed on the node feature vectors of each pair of views to realize information sharing and transfer. Experimental results on the Book-Crossing, MovieLens-1M, and Last.FM datasets show that, compared with the best results among the baseline methods, MKDRec improves the Area Under Curve (AUC) by 2.13%, 1.07%, and 3.44%, respectively, and the F1-score by 3.56%, 1.14%, and 4.46%, respectively.
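To make the first step concrete, the following is a minimal sketch, assuming a PyTorch setting, of random edge dropout over the interaction graph followed by an InfoNCE-style neighborhood contrastive loss between the two augmented collaborative views. The dropout rate, temperature, and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: edge dropout augmentation + contrastive loss (assumed setup).
import torch
import torch.nn.functional as F

def drop_edges(edge_index: torch.Tensor, drop_rate: float = 0.1) -> torch.Tensor:
    """Randomly drop a fraction of edges from a 2 x |E| interaction graph."""
    keep_mask = torch.rand(edge_index.size(1)) >= drop_rate
    return edge_index[:, keep_mask]

def neighborhood_contrastive_loss(z1: torch.Tensor,
                                  z2: torch.Tensor,
                                  temperature: float = 0.2) -> torch.Tensor:
    """InfoNCE loss: the same node in two augmented views is the positive
    pair; all other nodes serve as negatives."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature          # (N, N) similarity matrix
    labels = torch.arange(z1.size(0))           # positives on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage: 100 nodes, 500 interactions, 64-dim embeddings.
edge_index = torch.randint(0, 100, (2, 500))
view_a, view_b = drop_edges(edge_index), drop_edges(edge_index)
z1, z2 = torch.randn(100, 64), torch.randn(100, 64)  # stand-ins for GNN output on each view
loss = neighborhood_contrastive_loss(z1, z2)
```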
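The second step, relation-aware encoding and reconstruction of the item knowledge view, could be sketched as follows under similar assumptions. Each relation type gets its own embedding, and a tail entity's message to its head entity is weighted by a head-relation compatibility score; the scoring function, dimensions, and the simplistic global normalization are all illustrative, not the paper's exact formulation.

```python
# Minimal sketch: relation-weighted aggregation over (head, relation, tail) triples.
import torch
import torch.nn.functional as F

num_entities, num_relations, dim = 200, 10, 64
entity_emb = torch.randn(num_entities, dim)
relation_emb = torch.randn(num_relations, dim)

# Triples from a toy knowledge graph.
heads = torch.randint(0, num_entities, (500,))
rels = torch.randint(0, num_relations, (500,))
tails = torch.randint(0, num_entities, (500,))

# Compatibility score between the head entity and the relation on each edge;
# a real implementation would normalize per head node rather than globally.
scores = (entity_emb[heads] * relation_emb[rels]).sum(-1)
weights = F.softmax(scores, dim=0)

# Reconstruct head representations as relation-weighted sums of tail messages.
agg = torch.zeros(num_entities, dim)
agg.index_add_(0, heads, weights.unsqueeze(-1) * entity_emb[tails])
```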
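Finally, the pairwise inter-view distillation could look like the sketch below: node feature vectors from the collaborative, knowledge, and association views are softened with a temperature and aligned pairwise with a KL-divergence loss so that information is shared across views. The temperature and the symmetric pairing scheme are assumptions for illustration.

```python
# Minimal sketch: temperature-softened pairwise KL distillation across views.
import torch
import torch.nn.functional as F

def distill(student: torch.Tensor, teacher: torch.Tensor, tau: float = 4.0) -> torch.Tensor:
    """KL(teacher || student) on temperature-softened node features."""
    p_teacher = F.softmax(teacher / tau, dim=-1)
    log_p_student = F.log_softmax(student / tau, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau ** 2

# Toy usage: 64-dim node features from the three views for 100 items.
views = {name: torch.randn(100, 64) for name in ("collab", "knowledge", "assoc")}
pairs = [("collab", "knowledge"), ("collab", "assoc"), ("knowledge", "assoc")]
kd_loss = sum(distill(views[a], views[b]) + distill(views[b], views[a])
              for a, b in pairs)
```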

Key words: recommendation algorithm, Graph Neural Network (GNN), multi-view fusion, knowledge distillation, local enhancement

CLC number: