Journal of Computer Applications
ZHANG Shufen1,2,3, TANG Benjian1,2,3*, TIAN Zikun1,2,3, QIN Xiaoyang1,2,3
Abstract: With the rapid development of artificial intelligence, the risk of user privacy disclosure is growing increasingly serious. Differential privacy is a key privacy protection technology that prevents personal information leakage by introducing noise into data, while federated learning enables joint model training without exchanging raw data, thereby protecting data security. In recent years, the two have been combined to exploit their respective strengths: differential privacy guarantees privacy protection during data use, while federated learning improves model generalization and training efficiency through distributed training. Aiming at the privacy security problems of federated learning, the latest research progress on federated learning based on differential privacy was systematically summarized and compared, covering different differential privacy mechanisms, federated learning algorithms, and application scenarios. Particular attention was paid to how differential privacy is applied within federated learning, including data aggregation, gradient descent, and model training, and the advantages and disadvantages of each technique were analyzed. Finally, the current challenges and future development directions were summarized in detail.
Key words: federated learning, differential privacy, data aggregation, gradient descent, model training
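The abstract's core mechanism — adding noise to client gradients before server-side aggregation — can be illustrated with a minimal DP-SGD-style sketch. The function names, the clipping bound, and the noise multiplier below are illustrative assumptions, not an implementation from the surveyed papers:

```python
import numpy as np

def privatize_gradient(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client gradient to L2 norm <= clip_norm, then add Gaussian
    noise scaled by noise_multiplier * clip_norm (the usual DP-SGD recipe)."""
    rng = np.random.default_rng(0) if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale down only if the gradient exceeds the clipping bound.
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

def federated_average(client_grads, **dp_kwargs):
    """Server-side aggregation: average the privatized client updates."""
    return np.mean([privatize_gradient(g, **dp_kwargs) for g in client_grads],
                   axis=0)
```

With `noise_multiplier=0` the sketch reduces to plain clipped federated averaging; raising it trades model utility for a stronger privacy guarantee, which is the central tension the surveyed techniques address.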
CLC Number: TP309; TP18
ZHANG Shufen, TANG Benjian, TIAN Zikun, QIN Xiaoyang. Survey of federated learning based on differential privacy [J]. Journal of Computer Applications, DOI: 10.11772/j.issn.1001-9081.2024101505.
URL: https://www.joca.cn/EN/10.11772/j.issn.1001-9081.2024101505