Deep learning has been widely applied to computed tomography, but training neural networks typically requires a centralized dataset. To address this, federated learning has been proposed, which collaboratively exploits data from different local medical institutions under a privacy-preserving, decentralized strategy. However, large amounts of unpaired data are excluded from local model training, and directly aggregating the parameters degrades the performance of the updated global model. To deal with these issues, we present a semi-supervised and semi-centralized federated learning method that improves the performance of the learned global model. Specifically, each local model is trained locally with an unsupervised strategy for a fixed number of rounds. The parameters of the local models are then shared and aggregated on the server to update the global model. Next, the global model is further trained on a standard dataset of well-paired training samples, which stabilizes and standardizes it. Finally, the global model is distributed back to the local models for the next training round. For brevity, we refer to the presented federated learning method as "3SC-FL". Experiments demonstrate that the presented 3SC-FL outperforms the compared methods, both qualitatively and quantitatively.
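The training round described above (local unsupervised updates, server-side aggregation, supervised fine-tuning on the paired standard dataset, redistribution) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`local_update`, `fine_tune_on_standard`) are hypothetical, models are reduced to NumPy parameter vectors, and the training steps are stand-in arithmetic rather than real gradient updates.

```python
import numpy as np

def aggregate(local_params):
    """Server-side aggregation: simple federated averaging of parameters."""
    return np.mean(np.stack(local_params), axis=0)

def local_update(global_params, rounds=2, lr=0.1):
    """Placeholder for unsupervised local training on unpaired data.
    A fixed perturbation stands in for the local gradient steps."""
    params = global_params.copy()
    for _ in range(rounds):
        params -= lr * np.ones_like(params)  # stand-in for an update
    return params

def fine_tune_on_standard(global_params, lr=0.05):
    """Placeholder for supervised fine-tuning on the well-paired
    standard dataset, the semi-centralized step that stabilizes
    and standardizes the aggregated global model."""
    return global_params + lr * np.ones_like(global_params)

# One communication round per loop iteration, with 5 hypothetical clients.
global_params = np.zeros(4)
for _ in range(3):
    locals_ = [local_update(global_params) for _ in range(5)]
    global_params = aggregate(locals_)                     # server aggregation
    global_params = fine_tune_on_standard(global_params)   # centralized refinement
```

The key structural point is the extra `fine_tune_on_standard` step between aggregation and redistribution, which distinguishes this semi-centralized scheme from plain federated averaging.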