Journal of Computer Applications, 2026, Vol. 46, Issue (1): 33-42. DOI: 10.11772/j.issn.1001-9081.2024121840
• Artificial intelligence •
Hao YU1,2,3, Jing FAN1,2,3, Yihang SUN1,2,3, Yadong JIN1,2,3, Enkang XI1,2,3, Hua DONG1,2,3
Received: 2025-01-02
Revised: 2025-03-10
Accepted: 2025-03-18
Online: 2026-01-10
Published: 2026-01-10
Contact: Jing FAN
About author: YU Hao, born in 2000 in Xianning, Hubei, M.S. candidate, CCF student member. His research interests include federated learning, split learning, and edge computing.
Hao YU, Jing FAN, Yihang SUN, Yadong JIN, Enkang XI, Hua DONG. Federated split learning optimization method under edge heterogeneity[J]. Journal of Computer Applications, 2026, 46(1): 33-42.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2024121840
| Symbol | Description |
|---|---|
| | Performance vector of client k |
| | Set of clients in the j-th edge cluster |
| | Model cut point of the j-th edge cluster |
| | Set of clients in the j-th edge cluster participating in training in round t |
| | Communication latency between clients p and q |
| | Distance between clients p and q |
| | Set of clients in the cyclic-transfer sequence of the j-th edge cluster in round t |
| | Local model of the k-th client after sorting in the j-th edge cluster in round t |
| | Local loss function of the k-th client |
| | Sum of the sorted local losses in the j-th edge cluster |
| | Client communication latency threshold |

Tab. 1 Main symbols in the paper
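The symbol table enumerates the per-client and per-cluster quantities the method keeps track of: performance vectors, cluster membership, model cut points, the round's participant and cyclic-transfer sets, and a latency threshold. As a purely illustrative sketch, with all names chosen here rather than taken from the paper, this bookkeeping could be organized as follows:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ClientInfo:
    """Per-client state referenced in Tab. 1 (field names are illustrative)."""
    client_id: int
    performance: List[float]           # performance vector of client k
    position: Tuple[float, float]      # used to derive the distance between clients p and q


@dataclass
class EdgeCluster:
    """Per-cluster state referenced in Tab. 1 for the j-th edge cluster."""
    cluster_id: int
    clients: List[ClientInfo] = field(default_factory=list)    # client set of the j-th edge cluster
    cut_point: int = 0                                          # model cut (split) point of the cluster
    participants_t: List[int] = field(default_factory=list)    # clients participating in training in round t
    transfer_ring_t: List[int] = field(default_factory=list)   # cyclic-transfer client order in round t
    latency_threshold: float = 50.0                             # client communication latency threshold (unit assumed)
```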
| Dataset | Model | Learning rate | Decay rate | Batch fraction | Classes per client |
|---|---|---|---|---|---|
| FMNIST | 2 conv + 2 fc | 0.10 | 0.970 | 0.06 | 2 |
| CIFAR-10 | VGG-16 | 0.03 | 0.996 | 0.06 | 2 |
| CIFAR-100 | VGG-19 | 0.03 | 0.997 | 0.06 | 20 |

Tab. 2 Datasets and configurations used in experiments
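The settings in Tab. 2 map directly onto a configuration object that an implementation could consume. A minimal sketch, where the key names are ours and only the values come from the table:

```python
# Experiment configurations from Tab. 2; "classes_per_client" encodes the label skew
# used to simulate non-IID data (2 classes per client on FMNIST/CIFAR-10, 20 on CIFAR-100).
EXPERIMENT_CONFIGS = {
    "FMNIST": {
        "model": "2 conv + 2 fc",   # small CNN: two convolutional and two fully connected layers
        "learning_rate": 0.10,
        "decay_rate": 0.970,
        "batch_fraction": 0.06,
        "classes_per_client": 2,
    },
    "CIFAR-10": {
        "model": "VGG-16",
        "learning_rate": 0.03,
        "decay_rate": 0.996,
        "batch_fraction": 0.06,
        "classes_per_client": 2,
    },
    "CIFAR-100": {
        "model": "VGG-19",
        "learning_rate": 0.03,
        "decay_rate": 0.997,
        "batch_fraction": 0.06,
        "classes_per_client": 20,
    },
}
```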
| Method | FMNIST | CIFAR-10 | CIFAR-100 |
|---|---|---|---|
| FedAvg | 73.1 | 42.6 | 32.1 |
| FedProx | 77.3 | 53.2 | 39.3 |
| MOON | 78.5 | 59.6 | 41.3 |
| SplitFed | 75.8 | 71.5 | 32.8 |
| SplitMix | 71.5 | 56.4 | 33.3 |
| FedCRS | 87.2 | 82.6 | 43.4 |

Tab. 3 Accuracy comparison on different datasets (unit: %)
| Method | FMNIST | CIFAR-10 | CIFAR-100 |
|---|---|---|---|
| FedAvg | 72 | ≥1 000 | 1 078 |
| FedProx | 38 | 464 | 573 |
| MOON | 55 | 363 | 485 |
| SplitFed | 32 | 212 | 1 238 |
| SplitMix | 95 | 312 | 825 |
| FedCRS | 7 | 184 | 682 |

Tab. 4 Number of communication rounds required to reach target accuracy
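Tab. 4 reports, for each method, the first communication round at which test accuracy reaches the target threshold (an entry such as "≥1 000" indicates the threshold was not reached within the round budget). A minimal helper showing how this metric can be read off an accuracy curve; the function name and signature are ours:

```python
from typing import List, Optional


def rounds_to_target(accuracy_per_round: List[float], target: float) -> Optional[int]:
    """Return the first (1-indexed) communication round whose accuracy reaches
    `target`, or None if the threshold is never reached within the budget."""
    for round_idx, acc in enumerate(accuracy_per_round, start=1):
        if acc >= target:
            return round_idx
    return None


# Example: with a target of 0.80, this curve reaches it in round 3.
assert rounds_to_target([0.62, 0.75, 0.81, 0.83], target=0.80) == 3
```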
| Transfer weight | Accuracy/% | Transfer weight | Accuracy/% |
|---|---|---|---|
| 0.0 | 83.5 | 0.6 | 86.8 |
| 0.2 | 86.1 | 0.8 | 84.2 |
| 0.4 | 87.2 | 1.0 | 78.1 |

Tab. 5 Impact of transfer weights on accuracy
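Tab. 5 ablates the transfer weight. Purely as an assumed reading (the exact update rule is defined in the paper body and not reproduced here), if the transfer weight α blends the model handed over along the cyclic-transfer ring with the receiving client's locally trained model, the update could be sketched as:

```python
def blend_transferred_model(local_state: dict, transferred_state: dict, alpha: float) -> dict:
    """ASSUMED semantics: parameter-wise interpolation between the locally trained model
    and the model received over the cyclic-transfer ring.
    alpha = 0.0 keeps the local model unchanged; alpha = 1.0 replaces it entirely."""
    return {
        name: (1.0 - alpha) * local_state[name] + alpha * transferred_state[name]
        for name in local_state
    }
```

Under this reading, the trend in Tab. 5 (accuracy peaking at a moderate weight of 0.4 and dropping sharply at 1.0) would be consistent with partial rather than full transfer being beneficial.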