Federated class-incremental learning method of label semantic embedding with multi-head self-attention
Hu WANG, Xiaofeng WANG, Ke LI, Yunjie MA
Journal of Computer Applications, 2025, 45(10): 3083-3090. DOI: 10.11772/j.issn.1001-9081.2024101458
Abstract

Catastrophic forgetting poses a significant challenge to Federated Class-Incremental Learning (FCIL), degrading performance on sequential tasks. To address this issue, an FCIL method of Label Semantic Embedding (LSE) with Multi-Head Self-Attention (MHSA), named ATTLSE (ATTention Label Semantic Embedding), was proposed. Firstly, an LSE with MHSA was integrated into a generator. Secondly, during the Data-Free Knowledge Distillation (DFKD) stage, the MHSA-equipped generator was used to produce more meaningful data samples, which guided the training of client models and mitigated catastrophic forgetting in FCIL. Experiments were carried out on the CIFAR-100 and Tiny_ImageNet datasets. The results demonstrate that the average accuracy of ATTLSE is improved by 0.06 to 6.45 percentage points compared to that of the LANDER (Label Text Centered Data-Free Knowledge Transfer) method, alleviating catastrophic forgetting on sequential tasks in FCIL to a certain extent.
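To make the pipeline concrete, the sketch below illustrates, under stated assumptions, how label-text token embeddings refined by MHSA could condition a generator, and how the resulting synthetic samples could drive a standard knowledge-distillation step on a client model. AttnLabelGenerator, dfkd_step, the token-based label representation, and all layer sizes are hypothetical illustrations, not the paper's actual architecture or losses.

```python
# A minimal, illustrative sketch of the two ideas named in the abstract:
# (1) a label semantic embedding refined by multi-head self-attention
# that conditions a generator, and (2) a data-free distillation step that
# uses the synthetic samples to guide a client model. All layer sizes and
# the token-based label representation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnLabelGenerator(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, num_heads=4,
                 noise_dim=128, img_size=32):
        super().__init__()
        self.img_size = img_size
        # Hypothetical token embedding for class-name text.
        self.token_embed = nn.Embedding(vocab_size, embed_dim)
        # MHSA lets the tokens of a label description attend to each other.
        self.mhsa = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.fc = nn.Linear(embed_dim + noise_dim, 128 * (img_size // 4) ** 2)
        self.deconv = nn.Sequential(
            nn.BatchNorm2d(128), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, label_tokens, noise):
        # label_tokens: (B, T) token ids of each class-name text
        # noise:        (B, noise_dim) Gaussian noise
        e = self.token_embed(label_tokens)       # (B, T, D)
        attn, _ = self.mhsa(e, e, e)             # refine tokens jointly
        cond = attn.mean(dim=1)                  # pooled label semantic embedding
        h = self.fc(torch.cat([cond, noise], dim=1))
        h = h.view(-1, 128, self.img_size // 4, self.img_size // 4)
        return self.deconv(h)                    # (B, 3, H, W) synthetic images

def dfkd_step(student, teacher, generator, label_tokens, noise_dim, T=2.0):
    """One data-free distillation step: synthesize samples from label
    semantics, then match the student's softened outputs to the frozen
    teacher's (a standard KD loss; the paper may use additional terms)."""
    noise = torch.randn(label_tokens.size(0), noise_dim)
    with torch.no_grad():                        # teacher and generator fixed here
        x_syn = generator(label_tokens, noise)
        t_logits = teacher(x_syn)
    s_logits = student(x_syn)
    return F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
```

In an FCIL round, the frozen model carrying knowledge of previous tasks would typically serve as the teacher and the local client model as the student, so that training on new classes is regularized by the synthetic replay of old ones.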
