Journal of Computer Applications ›› 2025, Vol. 45 ›› Issue (10): 3083-3090.DOI: 10.11772/j.issn.1001-9081.2024101458

• Artificial intelligence •

Federated class-incremental learning method of label semantic embedding with multi-head self-attention

Hu WANG1, Xiaofeng WANG1,2, Ke LI1, Yunjie MA3

1. School of Computer Science and Engineering, North Minzu University, Yinchuan Ningxia 750021, China
    2. The Key Laboratory of Images and Graphics Intelligent Processing of State Ethnic Affairs Commission (North Minzu University), Yinchuan Ningxia 750021, China
    3. School of Mathematics and Information Science, North Minzu University, Yinchuan Ningxia 750021, China
  • Received:2024-10-16 Revised:2024-12-20 Accepted:2024-12-20 Online:2024-12-25 Published:2025-10-10
  • Contact: Xiaofeng WANG
  • About author: WANG Hu, born in 1998 in Nanjing, Jiangsu, M. S. candidate, CCF member. His research interests include federated learning and knowledge distillation.
    WANG Xiaofeng, born in 1980 in Huining, Gansu, Ph. D., associate professor, CCF member. His research interests include algorithm design and analysis, and artificial intelligence. Email: xfwang@nmu.edu.cn
    LI Ke, born in 2000 in Kaifeng, Henan, M. S. candidate, CCF member. His research interests include federated learning and multi-task learning.
    MA Yunjie, born in 1998 in Jincheng, Shanxi, M. S. candidate, CCF member. Her research interests include three-way decision and fuzzy sets.
  • Supported by:
    Ningxia Natural Science Foundation (2024AAC03165, 2024AAC03169); Ningxia Youth Top Talent Project (2021)


Abstract:

Catastrophic forgetting poses a significant challenge to Federated Class-Incremental Learning (FCIL), degrading performance on the continual sequence of tasks. To address this issue, an FCIL method based on Label Semantic Embedding (LSE) with Multi-Head Self-Attention (MHSA), named ATTLSE (ATTention Label Semantic Embedding), was proposed. Firstly, an LSE with MHSA was integrated into a generator. Secondly, during the Data-Free Knowledge Distillation (DFKD) stage, the generator with MHSA was used to produce more meaningful data samples, which guided the training of the client models and reduced the influence of catastrophic forgetting in FCIL. Experiments were carried out on the CIFAR-100 and Tiny_ImageNet datasets. The results demonstrate that the average accuracy of ATTLSE is improved by 0.06 to 6.45 percentage points compared with that of the LANDER (Label Text Centered Data-Free Knowledge Transfer) method, thereby alleviating to a certain extent the catastrophic forgetting of continual tasks in FCIL.
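The core architectural ingredient named in the abstract is multi-head self-attention applied to label semantic embeddings before they condition a generator. The following is a minimal NumPy sketch of standard MHSA over a small set of label embeddings; it is not the authors' implementation, and the dimensions, weight initializations, and the notion that the output rows would condition a DFKD generator are illustrative assumptions.

```python
import numpy as np

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Standard multi-head self-attention.
    X: (seq_len, d_model) label embeddings; Wq/Wk/Wv/Wo: (d_model, d_model)."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Split into heads: (n_heads, seq_len, d_head)
    split = lambda M: M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    # Scaled dot-product attention per head
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    heads = weights @ Vh                              # (n_heads, seq_len, d_head)
    # Concatenate heads and project back to d_model
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

# Toy setup: 4 class-label embeddings of dimension 8, attended with 2 heads.
rng = np.random.default_rng(0)
d = 8
E = rng.normal(size=(4, d))                      # one row per class label
W = [rng.normal(size=(d, d)) * 0.1 for _ in range(4)]
Z = multi_head_self_attention(E, *W, n_heads=2)  # refined label embeddings
print(Z.shape)                                   # (4, 8)
```

In this sketch each refined row of `Z` mixes information from all label embeddings, which is the property that could let a conditioned generator produce samples reflecting inter-class semantic relations.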

Key words: catastrophic forgetting, Federated Class-Incremental Learning (FCIL), Multi-Head Self-Attention (MHSA), Label Semantic Embedding (LSE), Data-Free Knowledge Distillation (DFKD)

