Journal of Computer Applications ›› 2026, Vol. 46 ›› Issue (2): 368-377. DOI: 10.11772/j.issn.1001-9081.2025020256
• Artificial intelligence •
Haoqian JIANG1, Dong ZHANG1, Guanyu LI1, Heng CHEN2
Received: 2025-03-17
Revised: 2025-05-20
Accepted: 2025-05-28
Online: 2025-06-10
Published: 2026-02-10
Contact: Guanyu LI
About author: JIANG Haoqian, born in 2001, M. S. candidate, CCF member. His research interests include intelligent information processing and recommender systems.
Haoqian JIANG, Dong ZHANG, Guanyu LI, Heng CHEN. SetaCRS: Conversational recommender system with structure-enhanced hierarchical task-oriented prompting strategy[J]. Journal of Computer Applications, 2026, 46(2): 368-377.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2025020256
| Dataset | Dialogues | Dialogue turns | Dialogue goals | Dialogue topics | Candidate items |
|---|---|---|---|---|---|
| TG-ReDial | | | | | |
| DuRecDial | 10 190 | 155 477 | | | |
Tab. 1 Introduction of datasets
| Model | NDCG@10 | NDCG@50 | MRR@10 | MRR@50 |
|---|---|---|---|---|
| SASRec | 0.369 | 0.413 | 0.307 | 0.317 |
| BERT | 0.546 | 0.572 | 0.498 | 0.504 |
| LLAMA-13b | 0.027 | 0.031 | 0.024 | 0.024 |
| Union | 0.557 | 0.583 | 0.510 | 0.516 |
| ChatGPT | 0.024 | 0.035 | 0.018 | 0.020 |
| GRU4Rec | 0.219 | 0.273 | 0.171 | 0.183 |
| TextCNN | 0.505 | 0.534 | 0.452 | 0.458 |
| UniMIND | 0.634 | 0.647 | 0.629 | 0.632 |
| SetaCRS | 0.664 | 0.700 | 0.648 | 0.659 |
Tab. 2 NDCG and MRR results on DuRecDial dataset
| Model | NDCG@10 | NDCG@50 | MRR@10 | MRR@50 |
|---|---|---|---|---|
| SASRec | 0.009 2 | 0.017 9 | 0.005 0 | 0.006 8 |
| GRU4Rec | 0.002 8 | 0.006 2 | 0.001 4 | 0.002 0 |
| MHIM | 0.015 2 | 0.025 6 | 0.010 8 | 0.012 9 |
| BERT | 0.024 6 | 0.043 9 | 0.018 2 | 0.021 1 |
| LLMCRS(Flan-T5) | 0.015 9 | 0.026 1 | 0.012 8 | 0.013 8 |
| Union | 0.034 8 | 0.052 7 | 0.024 0 | 0.027 7 |
| UniMIND | 0.038 6 | 0.063 8 | 0.028 3 | 0.031 9 |
| TextCNN | 0.014 4 | 0.021 5 | 0.011 9 | 0.013 3 |
| SetaCRS | 0.040 3 | 0.064 1 | 0.031 0 | 0.035 7 |
Tab. 3 NDCG and MRR results on TG-ReDial dataset
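The NDCG@k and MRR@k scores reported in Tab. 2 and Tab. 3 can be illustrated with a minimal sketch. This assumes binary relevance and a single ranked list per turn; the paper's exact evaluation protocol may differ:

```python
import math

def ndcg_at_k(ranked_relevances, k):
    """NDCG@k for one ranked list of binary relevance labels (1 = relevant)."""
    dcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ranked_relevances[:k]))
    ideal = sorted(ranked_relevances, reverse=True)
    idcg = sum(rel / math.log2(i + 2) for i, rel in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

def mrr_at_k(ranked_relevances, k):
    """Reciprocal rank of the first relevant item within the top k, else 0."""
    for i, rel in enumerate(ranked_relevances[:k]):
        if rel:
            return 1.0 / (i + 1)
    return 0.0

# The single relevant item is ranked at position 3:
ranking = [0, 0, 1, 0, 0]
print(mrr_at_k(ranking, 10))   # 1/3
print(ndcg_at_k(ranking, 10))  # 1/log2(4) = 0.5
```

Corpus-level scores in the tables would be these per-list values averaged over all evaluation turns.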
| Model | F1/% | BLEU-1 | BLEU-2 | Dist-2 |
|---|---|---|---|---|
| UniMIND | 52.19 | 0.479 | 0.398 | 0.079 |
| TPNet | NA | 0.308 | 0.217 | 0.093 |
| MGCG | 42.04 | 0.362 | 0.252 | 0.081 |
| GPT-2 | 47.01 | 0.392 | 0.295 | 0.165 |
| KERS | 50.47 | 0.463 | 0.362 | 0.079 |
| BART | 48.41 | 0.418 | 0.328 | 0.049 |
| GOKC | 47.28 | 0.413 | 0.318 | 0.084 |
| SetaCRS | 53.00 | 0.506 | 0.426 | 0.086 |
Tab. 4 F1, BLEU and Dist results on DuRecDial dataset
| Model | F1/% | BLEU-1 | BLEU-2 | Dist-2 |
|---|---|---|---|---|
| UniMIND | 35.62 | 0.314 | 0.090 | 0.198 |
| GPT-2 | NA | 0.279 | 0.066 | 0.094 |
| COLA | NA | NA | 0.025 | 0.151 |
| BART | 32.80 | 0.291 | 0.070 | 0.097 |
| Union | NA | 0.280 | 0.065 | 0.094 |
| SetaCRS | 38.66 | 0.341 | 0.091 | 0.186 |
Tab. 5 F1, BLEU and Dist results on TG-ReDial dataset
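The generation metrics in Tab. 4 and Tab. 5 can likewise be sketched. This is a simplified illustration: BLEU-1 is shown as clipped unigram precision without the brevity penalty, and Dist-2 as the ratio of distinct to total bigrams over all generated responses; the paper's scoring scripts may differ in detail:

```python
from collections import Counter

def bleu1(candidate, reference):
    """Clipped unigram precision (BLEU-1, brevity penalty omitted for brevity)."""
    cand_counts, ref_counts = Counter(candidate), Counter(reference)
    overlap = sum(min(c, ref_counts[w]) for w, c in cand_counts.items())
    return overlap / max(len(candidate), 1)

def dist2(responses):
    """Dist-2: distinct bigrams / total bigrams across a list of token lists."""
    bigrams = [tuple(t[i:i + 2]) for t in responses for i in range(len(t) - 1)]
    return len(set(bigrams)) / max(len(bigrams), 1)

cand = "i would recommend this movie".split()
ref = "i recommend this movie to you".split()
print(bleu1(cand, ref))        # 4 of 5 candidate unigrams appear in the reference: 0.8
print(dist2([cand, ref]))      # 7 distinct of 9 total bigrams
```

Higher Dist-2 indicates more lexically diverse responses, which is why GPT-2's high Dist-2 in Tab. 4 does not by itself imply better response quality.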