Federated Learning (FL) is a distributed machine learning method that aims to train a global model jointly, but it is difficult for a single global model to handle data drawn from multiple distributions. To deal with this multi-distribution challenge, clustered federated learning was introduced to optimize multiple shared models by grouping clients. Among existing approaches, server-side clustering has difficulty correcting clustering errors, while client-side clustering depends critically on the selection of the initial models. To solve these problems, an Automatically Adjusted Clustered Federated Learning (AACFL) framework was proposed, which used double-ended clustering to integrate server-side and client-side clustering. Firstly, double-ended clustering was used to divide clients into adjustable clusters; then, the cluster identities of local clients were adjusted automatically; finally, the correct client clusters were obtained. AACFL was evaluated on three classical federated datasets under non-independent and identically distributed (non-IID) conditions. Experimental results show that AACFL can obtain the correct clusters through adjustment when there are errors in the double-ended clustering results. Compared with FedAvg (Federated Averaging), CFL (Clustered Federated Learning), IFCA (Iterative Federated Clustering Algorithm) and other methods, AACFL effectively improves both the model convergence speed and the speed of obtaining correct clustering results, and improves accuracy by 0.20 to 23.16 percentage points on average when the number of clusters is 4 and the number of clients is 100. Therefore, the proposed framework can cluster clients efficiently while improving model convergence speed and accuracy.
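To make the double-ended idea concrete, the following is a minimal sketch of one communication round, not the authors' reference implementation. All names (`aacfl_round`, `local_loss`, `local_update`) and the concrete rules are assumptions for illustration: each client evaluates all cluster models locally and picks the one with the lowest loss (IFCA-style client-side clustering), the server-side assignment is approximated here by the client's previous identity (a real system would cluster the uploaded updates, e.g. by their similarity), and a client's cluster identity is automatically re-adjusted whenever the two ends disagree.

```python
# Illustrative sketch of double-ended clustered federated learning.
# Linear models and squared loss are used only to keep the example small.
import numpy as np

def local_loss(model, X, y):
    """Mean squared error of a linear model on a client's local data."""
    return float(np.mean((X @ model - y) ** 2))

def local_update(model, X, y, lr=0.1):
    """One gradient step on the local objective; returns the update vector."""
    grad = 2 * X.T @ (X @ model - y) / len(y)
    return -lr * grad

def aacfl_round(cluster_models, clients, identities):
    """One round: assign each client to a cluster from both ends, adjust
    disagreements, then aggregate updates per cluster (FedAvg-style)."""
    K = len(cluster_models)
    updates = [np.zeros_like(m) for m in cluster_models]
    counts = [0] * K
    for i, (X, y) in enumerate(clients):
        # Client side: evaluate all K cluster models locally, pick the best.
        client_choice = int(np.argmin(
            [local_loss(m, X, y) for m in cluster_models]))
        # Server side: approximated by the previous identity in this sketch.
        server_choice = identities[i]
        # Automatic adjustment: on disagreement, trust the local evaluation.
        identities[i] = client_choice if client_choice != server_choice else server_choice
        k = identities[i]
        updates[k] += local_update(cluster_models[k], X, y)
        counts[k] += 1
    # Average the collected updates within each cluster.
    for k in range(K):
        if counts[k]:
            cluster_models[k] += updates[k] / counts[k]
    return cluster_models, identities
```

Under these assumptions, identities that start out wrong are corrected over rounds because the client-side loss comparison keeps overriding stale assignments, which matches the abstract's claim that correct clusters are recovered even when the initial double-ended clustering contains errors.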