Journal of Computer Applications ›› 2025, Vol. 45 ›› Issue (11): 3529-3539. DOI: 10.11772/j.issn.1001-9081.2024111657
• Artificial intelligence •
Multiscale information diffusion prediction model based on hypergraph neural network
Jinghua ZHAO1, Zhu ZHANG1, Xiting LYU1, Huidan LIN2
Received: 2024-11-22
Revised: 2025-03-10
Accepted: 2025-03-18
Online: 2025-04-02
Published: 2025-11-10
Contact: Huidan LIN
About author: ZHAO Jinghua, born in 1984, Ph. D., associate professor. Her research interests include popularity prediction and interactive innovation.
Jinghua ZHAO, Zhu ZHANG, Xiting LYU, Huidan LIN. Multiscale information diffusion prediction model based on hypergraph neural network[J]. Journal of Computer Applications, 2025, 45(11): 3529-3539.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2024111657
| Dataset | #Users | #Links | #Cascades | Avg. Length |
|---|---|---|---|---|
| Twitter | 12 627 | 309 631 | 3 442 | 32.60 |
| Douban | 23 123 | 348 280 | 10 602 | 27.14 |
| Android | 9 958 | 48 573 | 679 | 33.30 |

Tab. 1 Statistical details of experimental datasets
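The statistics in Tab. 1 can be recomputed directly from the raw cascade and link files. Below is a minimal bookkeeping sketch, assuming cascades are stored as ordered lists of user IDs and the social graph as an edge list; both the data layout and the counting rule for #Users are assumptions, not the authors' preprocessing.

```python
def dataset_stats(cascades, edges):
    """Summarize a diffusion dataset in the style of Tab. 1.

    cascades : list of cascades, each a list of user IDs in adoption order.
    edges    : iterable of (u, v) social links.
    Note: #Users here is the union of users seen in cascades or links;
    the paper's own counting rules may differ.
    """
    edge_set = {tuple(e) for e in edges}
    users = {u for c in cascades for u in c} | {u for e in edge_set for u in e}
    avg_len = sum(len(c) for c in cascades) / len(cascades)
    return {
        "#Users": len(users),
        "#Links": len(edge_set),
        "#Cascades": len(cascades),
        "Avg. Length": round(avg_len, 2),
    }

# Toy example with two cascades and two follower links.
print(dataset_stats([["a", "b", "c"], ["b", "d"]], [("a", "b"), ("b", "d")]))
# {'#Users': 4, '#Links': 2, '#Cascades': 2, 'Avg. Length': 2.5}
```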
| Time window length | Hits@10 | mAP@10 | MSLE | Training time/h |
|---|---|---|---|---|
| 6 | 0.302 | 0.217 | 0.923 | 8.2 |
| 12 | 0.308 | 0.224 | 0.905 | 6.3 |
| 24 | 0.318 | 0.224 | 0.896 | 4.5 |
| 48 | 0.311 | 0.221 | 0.912 | 3.9 |

Tab. 2 Impact of time window length on model performance and training efficiency
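The first column of Tab. 2 is the time window length; presumably a longer window produces fewer, coarser segments of the diffusion history, which is consistent with training time falling as the window grows, while the setting of 24 gives the best Hits@10 and MSLE. The sketch below shows one plausible way to segment a cascade by such a window, assuming each cascade is a list of (user, timestamp) pairs whose timestamps share the window's time unit; the function and data layout are illustrative, not the paper's implementation.

```python
from collections import defaultdict

def split_cascade_by_window(cascade, window_len=24):
    """Group a cascade's (user_id, timestamp) events into consecutive
    fixed-length time windows.

    Illustrative only: timestamps are assumed to share the unit of
    `window_len`; the paper's actual preprocessing may differ.
    """
    if not cascade:
        return []
    start = min(t for _, t in cascade)
    buckets = defaultdict(list)
    for user, t in cascade:
        idx = int((t - start) // window_len)  # index of the window this event falls into
        buckets[idx].append(user)
    return [buckets[i] for i in sorted(buckets)]  # chronological order, empty windows skipped

# Toy cascade spanning roughly two days, split with a window of 24 (hours).
toy = [("u1", 0.5), ("u2", 3.0), ("u3", 26.0), ("u4", 47.5)]
print(split_cascade_by_window(toy, window_len=24))  # [['u1', 'u2'], ['u3', 'u4']]
```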
| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Batch Size | 16 | d_model | 64 |
| d_pos | 8 | warmup epochs | 10 |
| Num Epoch | 50 | embed_dim | 64 |
| Dropout Rate | 0.1 | k | 5 |

Tab. 3 Parameter settings
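For reproducibility, the settings in Tab. 3 can be gathered into a single configuration object; the sketch below simply mirrors the table. Field names follow the table, details not listed there (optimizer, learning rate, etc.) are omitted, and the comments on d_model and k are hedged because their exact roles are not explained on this page.

```python
from dataclasses import dataclass

@dataclass
class MIDHGNNConfig:
    """Hyperparameters reported in Tab. 3; unlisted training details are omitted."""
    batch_size: int = 16
    d_model: int = 64          # from Tab. 3 (presumably the sequence-encoder hidden size)
    d_pos: int = 8             # position embedding dimension (cf. Tab. 8)
    warmup_epochs: int = 10
    num_epochs: int = 50
    embed_dim: int = 64        # user node embedding dimension (cf. Tab. 7)
    dropout_rate: float = 0.1
    k: int = 5                 # "k" from Tab. 3; its exact role is not stated on this page

config = MIDHGNNConfig()
print(config)
```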
| Dataset | Model | Hits@10 | Hits@50 | Hits@100 | mAP@10 | mAP@50 | mAP@100 |
|---|---|---|---|---|---|---|---|
| Twitter | NDM | 0.215 2 | 0.322 3 | 0.383 1 | 0.143 0 | 0.148 0 | 0.148 9 |
| | SNIDSA | 0.233 7 | 0.354 6 | 0.434 9 | 0.148 4 | 0.154 0 | 0.155 1 |
| | GraphSAGE | 0.257 7 | 0.337 8 | 0.453 7 | 0.156 3 | 0.166 5 | 0.170 2 |
| | FOREST | 0.261 8 | 0.409 5 | 0.503 9 | 0.172 1 | 0.178 8 | 0.180 2 |
| | TGAT | 0.248 0 | 0.360 7 | 0.478 7 | 0.164 0 | 0.170 4 | 0.178 0 |
| | MIDHGNN | 0.318 4 | 0.443 7 | 0.534 5 | 0.227 9 | 0.233 1 | 0.236 3 |
| Douban | NDM | 0.103 1 | 0.188 7 | 0.240 2 | 0.055 4 | 0.059 3 | 0.060 |
| | SNIDSA | 0.118 1 | 0.219 1 | 0.283 7 | 0.063 6 | 0.068 1 | 0.069 1 |
| | GraphSAGE | 0.123 3 | 0.190 3 | 0.223 9 | 0.053 3 | 0.057 2 | 0.060 3 |
| | FOREST | 0.141 6 | 0.247 9 | 0.312 5 | 0.078 9 | 0.083 8 | 0.084 7 |
| | TGAT | 0.137 8 | 0.184 8 | 0.265 3 | 0.067 2 | 0.071 8 | 0.076 3 |
| | MIDHGNN | 0.160 2 | 0.275 8 | 0.345 4 | 0.091 1 | 0.096 0 | 0.097 0 |
| Android | NDM | 0.017 0 | 0.042 3 | 0.055 5 | 0.005 9 | 0.007 0 | 0.007 2 |
| | GraphSAGE | 0.065 5 | 0.094 3 | 0.118 2 | 0.052 2 | 0.054 1 | 0.057 3 |
| | FOREST | 0.086 6 | 0.173 9 | 0.231 4 | 0.062 8 | 0.066 7 | 0.067 5 |
| | TGAT | 0.071 2 | 0.166 2 | 0.196 4 | 0.061 1 | 0.063 2 | 0.066 1 |
| | MIDHGNN | 0.096 9 | 0.189 6 | 0.250 6 | 0.072 3 | 0.075 3 | 0.075 4 |

Tab. 4 Experimental results of micro-scale prediction
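Hits@k and mAP@k in Tab. 4 treat micro-scale (next infected user) prediction as a ranking task: Hits@k is the fraction of prediction steps whose ground-truth user appears in the model's top-k list, and mAP@k, with a single relevant user per step, reduces to the mean reciprocal rank truncated at k. Below is a minimal sketch following that common convention in diffusion-prediction work; it is not the authors' evaluation code.

```python
import numpy as np

def hits_and_map_at_k(scores, targets, k=10):
    """Ranking metrics for next-user prediction with one true user per step.

    scores  : (num_steps, num_users) array of model scores.
    targets : (num_steps,) array of ground-truth user indices.
    Hits@k  : fraction of steps whose true user is ranked within the top k.
    mAP@k   : mean of 1/rank for ranks <= k, else 0 (single-target convention).
    """
    scores = np.asarray(scores, dtype=float)
    targets = np.asarray(targets, dtype=int)
    target_scores = scores[np.arange(len(targets)), targets]
    # Rank of the target = number of users scored strictly higher, plus one.
    ranks = (scores > target_scores[:, None]).sum(axis=1) + 1
    hit = ranks <= k
    return float(hit.mean()), float(np.where(hit, 1.0 / ranks, 0.0).mean())

# Toy check: two prediction steps over four candidate users.
s = np.array([[0.1, 0.9, 0.3, 0.2],    # true user 1 ranked 1st
              [0.8, 0.1, 0.05, 0.6]])  # true user 2 ranked 4th
print(hits_and_map_at_k(s, np.array([1, 2]), k=2))  # (0.5, 0.5)
```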
| Dataset | Model | MSLE |
|---|---|---|
| Twitter | DeepCas | 2.261 |
| | DeepHawkes | 2.411 |
| | FOREST | 0.975 |
| | TGAT | 1.164 |
| | MIDHGNN | 0.896 |
| Douban | DeepCas | 2.122 |
| | DeepHawkes | 1.725 |
| | FOREST | 0.825 |
| | TGAT | 0.927 |
| | MIDHGNN | 0.721 |
| Android | DeepCas | 2.122 |
| | DeepHawkes | 1.971 |
| | FOREST | 0.556 |
| | TGAT | 0.741 |
| | MIDHGNN | 0.538 |

Tab. 5 Experimental results of macro-scale prediction
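MSLE in Tab. 5 measures macro-scale (cascade size) prediction error as the mean squared difference of log-transformed predicted and true sizes, so lower is better. A hedged sketch of the usual log(1 + x) formulation follows; the authors' exact definition (log base, size vs. increment) is not shown on this page.

```python
import numpy as np

def msle(y_pred, y_true):
    """Mean squared logarithmic error between predicted and true cascade sizes:
    MSLE = mean((log(1 + y_pred) - log(1 + y_true))**2).

    The log(1 + x) form is a common convention; some cascade papers use a
    different base or predict the size increment, so treat this as an
    assumption rather than the authors' exact definition.
    """
    y_pred = np.asarray(y_pred, dtype=float)
    y_true = np.asarray(y_true, dtype=float)
    return float(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2))

# Toy example: predicted vs. true final sizes of three cascades.
print(round(msle([30, 12, 100], [32, 10, 150]), 3))  # small value; 0 means perfect
```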
| Model | Hits@100 | mAP@100 | MSLE |
|---|---|---|---|
| -GCN | 0.515 4 | 0.220 2 | 0.935 |
| -HGNN | 0.503 2 | 0.214 0 | 1.021 |
| -RL | 0.520 2 | 0.223 0 | — |
| -HGNN+RL | 0.501 1 | 0.207 4 | — |
| MIDHGNN | 0.534 5 | 0.236 3 | 0.896 |

Tab. 6 Ablation experiment results on Twitter dataset
| User node embedding dimension | MSLE | User node embedding dimension | MSLE |
|---|---|---|---|
| 16 | 1.046 | 64 | 0.896 |
| 32 | 0.967 | 128 | 0.986 |

Tab. 7 Impact of user node embedding dimension on macro prediction index value
| Position embedding dimension | MSLE | Position embedding dimension | MSLE |
|---|---|---|---|
| 2 | 1.134 | 16 | 1.037 |
| 4 | 1.083 | 32 | 1.042 |
| 8 | 0.896 | | |

Tab. 8 Impact of position embedding dimension on macro prediction index value
| Number of GRU layers | MSLE | Number of GRU layers | MSLE |
|---|---|---|---|
| 1 | 0.921 | 3 | 0.912 |
| 2 | 0.896 | 4 | 0.934 |

Tab. 9 Impact of number of GRU layers on macro prediction index value
| Number of attention heads | MSLE | Number of attention heads | MSLE |
|---|---|---|---|
| 2 | 0.945 | 10 | 0.901 |
| 4 | 0.091 | 12 | 0.899 |
| 6 | 0.903 | 14 | 0.904 |
| 8 | 0.896 | | |

Tab. 10 Impact of number of attention heads on macro prediction index value
References
[1] WANG Z Y, ZHU X F. Research on fake news detection based on multimodal Transformer[J]. Journal of the China Society for Scientific and Technical Information, 2023, 42(12): 1477-1486. (in Chinese)
[2] LI Q, XIE Y, WU X, et al. User behavior prediction model based on implicit links and multi-type rumor messages[J]. Knowledge-Based Systems, 2023, 262: No.110276.
[3] WANG J, HU Y, JIANG T X, et al. Essential tensor learning for multimodal information-driven stock movement prediction[J]. Knowledge-Based Systems, 2023, 262: No.110262.
[4] YANG X, YANG Y, SU J, et al. Who's next: rising star prediction via diffusion of user interest in social networks[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(5): 5413-5425.
[5] LI Y, JIN H, YU X, et al. Intelligent prediction of private information diffusion in social networks[J]. Electronics, 2020, 9(5): No.719.
[6] PORTILLO-VAN DIEST A, BALLESTER COMA L, MORTIER P, et al. Experience sampling methods for the personalised prediction of mental health problems in Spanish university students: protocol for a survey-based observational study within the PROMES-U project[J]. BMJ Open, 2023, 13(7): No.e072641.
[7] ZHAO R Y, LI X L, LI D Y. Structure and evolution feature on information transmission network of university new media[J]. Information Science, 2022, 40(6): 3-11. (in Chinese)
[8] WANG X W, ZHUANG H X, JIANG Y B, et al. Research on network structure of echo chamber in network public opinion propagation for major emergencies[J]. Information Studies: Theory and Application, 2024, 47(1): 101-109. (in Chinese)
[9] CHEN X, ZHOU F, ZHANG K, et al. Information diffusion prediction via recurrent cascades convolution[C]// Proceedings of the IEEE 35th International Conference on Data Engineering. Piscataway: IEEE, 2019: 770-781.
[10] SUN X, ZHOU J, LIU L, et al. Explicit time embedding based cascade attention network for information popularity prediction[J]. Information Processing and Management, 2023, 60(3): No.103278.
[11] MIAO C X, LIU X Y. Information diffusion prediction based on hypergraph attention mechanism and graph convolution network[J]. Application Research of Computers, 2023, 40(6): 1715-1720. (in Chinese)
[12] GONG Y C, WANG M, LIANG W, et al. UHIR: an effective information dissemination model of online social hypernetworks based on user and information attributes[J]. Information Sciences, 2023, 644: No.119284.
[13] DING S C, BAO Z, LIU X Y. Research on user behavior prediction of emergency public opinion communication[J]. Journal of Modern Information, 2023, 43(9): 111-123. (in Chinese)
[14] JIA X, SHANG J, LIU D, et al. HeDAN: heterogeneous diffusion attention network for popularity prediction of online content[J]. Knowledge-Based Systems, 2022, 254: No.109659.
[15] MENG F, CHEN L, HERRING P, et al. Information and disseminator features influences online negative information recognition and dissemination[J]. International Journal of Pattern Recognition and Artificial Intelligence, 2023, 37(3): No.2350005.
[16] ZHAO X Y, ZENG Y Y, JIANG H. Information popularity prediction model based on C-DGCN[J]. Engineering Journal of Wuhan University, 2023, 56(4): 506-514. (in Chinese)
[17] ZHAO J H, ZHAO J L, FENG J. Information diffusion prediction based on cascade sequences and social topology[J]. Computers and Electrical Engineering, 2023, 109(Pt B): No.108782.
[18] LIANG S B, CHEN Z H, WEI J J, et al. Information diffusion prediction based on cascade spatial-temporal feature[J]. Pattern Recognition and Artificial Intelligence, 2021, 34(11): 969-978. (in Chinese)
[19] LYU X T, ZHAO J H, RONG H Y, et al. Information diffusion prediction model based on Transformer and relational graph convolutional network[J]. Journal of Computer Applications, 2024, 44(6): 1760-1766. (in Chinese)
[20] WANG R, XU X, ZHANG Y. Multiscale information diffusion prediction with minimal substitution neural network[J]. IEEE Transactions on Neural Networks and Learning Systems, 2025, 36(1): 1069-1080.
[21] YANG C, WANG H, TANG J, et al. Full-scale information diffusion prediction with reinforced recurrent networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(5): 2271-2283.
[22] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. [2024-09-26].
[23] CAO D L. Research on financial investment based on deep reinforcement learning[D]. Jinan: Shandong University of Finance and Economics, 2023. (in Chinese)
[24] WILLIAMS R J. Simple statistical gradient-following algorithms for connectionist reinforcement learning[J]. Machine Learning, 1992, 8(3/4): 229-256.
[25] HODAS N O, LERMAN K. The simple rules of social contagion[J]. Scientific Reports, 2014, 4: No.4343.
[26] ZHONG E, FAN W, WANG J, et al. ComSoc: adaptive transfer of user behaviors over composite social network[C]// Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2012: 696-704.
[27] SANKAR A, ZHANG X, KRISHNAN A, et al. Inf-VAE: a variational autoencoder framework to integrate homophily and influence in diffusion prediction[C]// Proceedings of the 13th ACM International Conference on Web Search and Data Mining. New York: ACM, 2020: 510-518.
[28] YANG C, SUN M, LIU H, et al. Neural diffusion model for microscopic cascade study[J]. IEEE Transactions on Knowledge and Data Engineering, 2018, 14(8): 1128-1139.
[29] WANG Z, CHEN C, LI W. A sequential neural information diffusion model with structure attention[C]// Proceedings of the 27th ACM International Conference on Information and Knowledge Management. New York: ACM, 2018: 1795-1798.
[30] HAMILTON W L, YING R, LESKOVEC J. Inductive representation learning on large graphs[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2017: 1025-1035.
[31] LI C, MA J, GUO X, et al. DeepCas: an end-to-end predictor of information cascades[C]// Proceedings of the 26th International Conference on World Wide Web. Republic and Canton of Geneva: International World Wide Web Conferences Steering Committee, 2017: 577-586.
[32] CAO Q, SHEN H, CEN K, et al. DeepHawkes: bridging the gap between prediction and understanding of information cascades[C]// Proceedings of the 2017 ACM Conference on Information and Knowledge Management. New York: ACM, 2017: 1149-1158.
[33] YANG C, TANG J, SUN M, et al. Multi-scale information diffusion prediction with reinforced recurrent networks[C]// Proceedings of the 28th International Joint Conference on Artificial Intelligence. California: IJCAI.org, 2019: 4033-4039.
[34] XU D, RUAN C, KORPEOGLU E, et al. Inductive representation learning on temporal graphs[EB/OL]. [2024-07-06].
[35] LIU B, YANG D, SHI Y, et al. Improving information cascade modeling by social topology and dual role user dependency[C]// Proceedings of the 2022 International Conference on Database Systems for Advanced Applications, LNCS 13245. Cham: Springer, 2022: 425-440.
[36] CHEN X, ZHANG K, ZHOU F, et al. Information cascades modeling via deep multi-task learning[C]// Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2019: 885-888.
[37] BAO P, XU H. Predicting popularity of online contents via graph attention based spatial-temporal neural network[J]. Pattern Recognition and Artificial Intelligence, 2019, 32(11): 1014-1021. (in Chinese)
[38] JIAO P, CHEN H, BAO Q, et al. Enhancing multi-scale diffusion prediction via sequential hypergraphs and adversarial learning[C]// Proceedings of the 38th AAAI Conference on Artificial Intelligence. Palo Alto: AAAI Press, 2024: 8571-8581.
[39] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook: Curran Associates Inc., 2017: 6000-6010.