In smart applications such as smart medical equipment, increasing amounts of data must be processed and models trained locally or near the local end to prevent privacy leaks. However, smart devices have limited storage and computing capabilities, so some computing tasks must be outsourced; at the same time, malicious nodes must be prevented from accessing user data during outsourced computation. Therefore, this paper proposes EVPP (efficient, verifiable, and privacy-preserving), a computation outsourcing scheme for the training process of machine learning models, in which edge nodes outsource complex computation to an edge service node. First, we conducted tests to identify which parts of the training process need to be outsourced; in this solution, the computationally intensive part of model training is outsourced. The outsourced training matrices are masked with random encryption perturbations, and verification factors are introduced to ensure that the results are verifiable. In addition, when a malicious service node is detected, the system can generate verifiable evidence to support a trust mechanism. We also discuss how the scheme can be applied to other algorithms to broaden its applicability. Theoretical analysis and experimental results show that the proposed scheme makes effective use of the devices' computing power.
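To make the core idea concrete, the following is a minimal Python sketch of outsourcing a matrix multiplication with random blinding and a probabilistic result check. The diagonal blinding matrices and the Freivalds-style verification used here are illustrative assumptions only; the abstract does not specify the actual perturbation or verification factors used by EVPP.

```python
import numpy as np

def blind(A, B, rng):
    """Mask the operands with random invertible diagonal matrices
    before sending them to the (untrusted) edge service node.
    (Illustrative assumption; not the scheme's actual perturbation.)"""
    p = rng.uniform(1.0, 2.0, size=A.shape[0])   # diagonal of P
    q = rng.uniform(1.0, 2.0, size=B.shape[1])   # diagonal of Q
    A_blind = p[:, None] * A                     # P @ A
    B_blind = B * q[None, :]                     # B @ Q
    return A_blind, B_blind, p, q

def unblind(C_blind, p, q):
    """Recover C = A @ B from the returned result P @ A @ B @ Q."""
    return C_blind / p[:, None] / q[None, :]

def freivalds_check(A, B, C, rng, rounds=10):
    """Probabilistically verify C == A @ B without redoing the full
    multiplication (O(n^2) per round instead of O(n^3))."""
    n = B.shape[1]
    for _ in range(rounds):
        r = rng.integers(0, 2, size=n).astype(float)
        if not np.allclose(A @ (B @ r), C @ r):
            return False   # evidence of a misbehaving service node
    return True

# --- usage sketch ---
rng = np.random.default_rng(0)
A = rng.standard_normal((128, 256))
B = rng.standard_normal((256, 64))

A_b, B_b, p, q = blind(A, B, rng)    # performed on the edge node
C_b = A_b @ B_b                      # performed by the edge service node
C = unblind(C_b, p, q)               # performed on the edge node
assert freivalds_check(A, B, C, rng) # verify before accepting the result
```

Diagonal blinding is a deliberately simple stand-in here; it hides magnitudes but not the full structure of the matrices, so the actual scheme would be expected to use a stronger perturbation and its own verification factors.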