1. School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, Sichuan 611756, China; 2. Sichuan Key Laboratory of Cloud Computing and Intelligent Technique (Southwest Jiaotong University), Chengdu, Sichuan 611756, China
Contact:
LI Tianrui, born in 1969, Ph.D., professor. His research interests include big data, cloud computing, data mining, machine learning, granular computing, and rough sets.
About authors: ZHENG Sai, born in 1996, M.S. candidate. His research interests include federated learning and machine learning. LI Tianrui, born in 1969, Ph.D., professor. His research interests include big data, cloud computing, data mining, machine learning, granular computing, and rough sets. HUANG Wei, born in 1994, Ph.D. candidate. Her research interests include federated learning and data mining.
Supported by:
This work is partially supported by the National Key Research and Development Program of China (2019YFB2101802) and the National Natural Science Foundation of China (62176221).
ZHENG Sai, LI Tianrui, HUANG Wei. Federated learning algorithm for communication cost optimization[J]. Journal of Computer Applications, 2023, 43(1): 1-7.