Classic Federated Learning (FL) algorithms struggle to achieve good results in scenarios where data is highly heterogeneous. Personalized FL (PFL) addresses the data heterogeneity problem in federated learning by “tailoring” a dedicated model for each client. The resulting models perform well, but personalization makes it difficult to extend federated learning to new clients. To address both the performance and scalability challenges of PFL, FedDual, an FL model with a dual-stream neural network structure, is proposed. By adding an encoder that analyzes each client’s personalized characteristics, the model attains the performance of personalized models while remaining easy to extend to new clients. Experimental results show that, compared with the classic Federated Averaging (FedAvg) algorithm, FedDual clearly improves accuracy on datasets such as MNIST and FashionMNIST, and improves accuracy by more than 10 percentage points on CIFAR10. Moreover, FedDual achieves “plug and play” for new clients without loss of accuracy, solving the problem of poor scalability to new clients.
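To make the dual-stream idea concrete, the following is a minimal PyTorch sketch of what such a client model could look like: a shared feature stream whose weights are averaged across clients, plus a client-side encoder for personalized characteristics, with the two feature streams fused before the prediction head. The class names (`SharedStream`, `PersonalEncoder`, `DualStreamNet`), layer sizes, and concatenation-based fusion are illustrative assumptions, not the paper's exact architecture.

```python
# Minimal sketch of a dual-stream client model, assuming fusion by concatenation.
# Names and dimensions are hypothetical; they do not reproduce FedDual's exact design.
import torch
import torch.nn as nn


class SharedStream(nn.Module):
    """Feature extractor whose weights are aggregated across clients (FedAvg-style)."""
    def __init__(self, in_dim=784, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, feat_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class PersonalEncoder(nn.Module):
    """Client-specific encoder capturing local (personalized) data characteristics."""
    def __init__(self, in_dim=784, feat_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, feat_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class DualStreamNet(nn.Module):
    """Dual-stream model: shared and personalized features are fused for prediction."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.shared = SharedStream()
        self.personal = PersonalEncoder()
        self.head = nn.Linear(128 + 32, num_classes)

    def forward(self, x):
        x = x.view(x.size(0), -1)  # flatten e.g. 28x28 MNIST images
        fused = torch.cat([self.shared(x), self.personal(x)], dim=1)
        return self.head(fused)


# Example forward pass on a batch of MNIST-sized inputs.
model = DualStreamNet()
logits = model(torch.randn(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```

Under this reading, only the shared stream (and possibly the head) would be uploaded for server-side averaging, while the personal encoder stays on the client; a new client could then download the aggregated shared weights and plug in its own encoder, which is one way the “plug and play” property described above could be realized.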