Lightweight service identification models are essential for resource-constrained distribution grid systems. To address the growing size of deep learning models, we propose a lightweight method for identifying complex power services based on knowledge distillation and network pruning. Specifically, a pruning method based on Taylor expansion is first used to rank the importance of the small-scale network's parameters and remove the least important ones, compressing the model and reducing its computational cost and complexity. Knowledge distillation is then used to transfer knowledge from the large-scale network ResNet50 to the small-scale network: through the loss function, the small-scale network fits the soft-label outputs of the large-scale network, completing the knowledge transfer. Experimental results show that this method compresses the small network while improving its recognition accuracy. Compared with the original small network, accuracy improves by 2.24 percentage points, to 97.24%, the number of parameters is reduced by 81.9%, and the number of floating-point operations is reduced by 92.1%, making the model better suited for deployment on resource-constrained devices.
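
The abstract does not specify the exact Taylor criterion or pruning ratio, so the following is only a minimal PyTorch sketch of one common first-order variant: each convolutional filter is scored by the magnitude of weight times gradient (a first-order Taylor estimate of the loss change if the filter were removed), and the lowest-ranked filters are masked to zero. The function names `taylor_scores` and `prune_lowest` and the ratio of 0.3 are illustrative assumptions, not the paper's settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def taylor_scores(model: nn.Module, x: torch.Tensor, y: torch.Tensor) -> dict:
    """Score each conv filter by |w * dL/dw|, summed over the filter's weights
    (a first-order Taylor estimate of the loss change from removing it)."""
    model.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    scores = {}
    for name, m in model.named_modules():
        if isinstance(m, nn.Conv2d):
            # weight shape: (out_channels, in_channels, kH, kW);
            # summing over dims 1-3 yields one importance score per filter.
            scores[name] = (m.weight * m.weight.grad).abs().sum(dim=(1, 2, 3)).detach()
    return scores

def prune_lowest(model: nn.Module, scores: dict, ratio: float = 0.3) -> None:
    """Mask the lowest-ranked filters in each scored conv layer to zero."""
    for name, m in model.named_modules():
        if name in scores:
            k = int(scores[name].numel() * ratio)
            if k == 0:
                continue
            idx = scores[name].argsort()[:k]  # indices of least important filters
            with torch.no_grad():
                m.weight[idx] = 0             # zero out the pruned filters
                if m.bias is not None:
                    m.bias[idx] = 0
```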
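For the distillation step, the abstract describes the student fitting the teacher's soft labels through the loss function. A standard way to realize this (Hinton-style distillation) is a weighted sum of a temperature-softened KL term against the teacher's logits and the usual cross-entropy against hard labels; the temperature `T = 4.0` and weight `alpha = 0.7` below are illustrative assumptions, not values reported in the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0, alpha: float = 0.7) -> torch.Tensor:
    # Soft-label term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: standard cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Typical use (illustrative): freeze the ResNet50 teacher and train the
# pruned student on the combined loss.
#   teacher.eval()
#   with torch.no_grad():
#       teacher_logits = teacher(images)
#   loss = distillation_loss(student(images), teacher_logits, labels)
```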