Catastrophic forgetting poses a significant challenge to Federated Class-Incremental Learning (FCIL), degrading performance on sequentially learned tasks. To address this issue, an FCIL method named ATTLSE (ATTention Label Semantic Embedding), which combines Label Semantic Embedding (LSE) with Multi-Head Self-Attention (MHSA), was proposed. Firstly, an LSE module with MHSA was integrated into a generator. Secondly, during the Data-Free Knowledge Distillation (DFKD) stage, the MHSA-equipped generator was used to produce more meaningful data samples, which guided the training of client models and mitigated catastrophic forgetting in FCIL. Experiments were carried out on the CIFAR-100 and Tiny_ImageNet datasets. The results demonstrate that the average accuracy of ATTLSE improves by 0.06 to 6.45 percentage points over the LANDER (Label Text Centered Data-Free Knowledge Transfer) method, alleviating to a certain extent the catastrophic forgetting of sequential tasks in FCIL.
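To make the pipeline concrete, the sketch below shows one way label semantic embeddings can be refined with multi-head self-attention and then used to condition a generator that synthesizes samples for DFKD. This is a minimal illustration, not the authors' exact architecture: the framework (PyTorch), the `LabelAttention` and `ConditionalGenerator` names, and all dimensions and layer choices are assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's exact design) of a
# generator conditioned on label semantic embeddings refined by MHSA.
import torch
import torch.nn as nn


class LabelAttention(nn.Module):
    """Refine label semantic embeddings with multi-head self-attention."""

    def __init__(self, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.mhsa = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, label_emb: torch.Tensor) -> torch.Tensor:
        # label_emb: (batch, seq_len, embed_dim)
        attn_out, _ = self.mhsa(label_emb, label_emb, label_emb)
        return self.norm(label_emb + attn_out)  # residual connection


class ConditionalGenerator(nn.Module):
    """Synthesize images from noise conditioned on attended label embeddings."""

    def __init__(self, noise_dim: int = 100, embed_dim: int = 128, img_size: int = 32):
        super().__init__()
        self.attn = LabelAttention(embed_dim)
        self.img_size = img_size
        self.fc = nn.Linear(noise_dim + embed_dim, 256 * (img_size // 4) ** 2)
        self.decode = nn.Sequential(
            nn.BatchNorm2d(256),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(256, 128, 3, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, noise: torch.Tensor, label_emb: torch.Tensor) -> torch.Tensor:
        # noise: (batch, noise_dim); label_emb: (batch, embed_dim)
        attended = self.attn(label_emb.unsqueeze(1)).squeeze(1)
        h = self.fc(torch.cat([noise, attended], dim=1))
        h = h.view(-1, 256, self.img_size // 4, self.img_size // 4)
        return self.decode(h)


# Usage: produce a batch of synthetic samples for data-free distillation.
gen = ConditionalGenerator()
noise = torch.randn(8, 100)
label_emb = torch.randn(8, 128)  # stand-in for pretrained label text embeddings
fake_images = gen(noise, label_emb)
print(fake_images.shape)  # torch.Size([8, 3, 32, 32])
```

In this reading, the self-attention step lets each label embedding aggregate semantic context before conditioning generation, which is one plausible mechanism behind the "more meaningful data samples" the abstract attributes to the MHSA-equipped generator.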