Big data analysis based on artificial intelligence has become increasingly important in the Internet era. In industry, data is stored across different regions, and sending it to central servers for centralized training incurs a huge communication cost. Distributed machine learning addresses both the storage of dispersed data and the cost of data communication, but existing distributed machine learning frameworks still suffer from low algorithm compatibility and poor extensibility. The aim of this paper is to build a distributed machine learning framework based on Ps-Lite and to implement algorithms on top of it. The framework is realized with asynchronous communication and computation. The implemented algorithms include a gradient-aggregation algorithm (distributed Stochastic Gradient Descent) and three regression algorithms (Logistic Regression, Lasso Regression, and Ridge Regression); their implementation illustrates that common algorithms fit this framework with high compatibility and strong extensibility. Finally, an experiment with the Logistic Regression implementation demonstrates the framework's performance: per-node computation time is reduced by 50% as the number of nodes increases, the accuracy of the trained model is maintained above 70%, and the convergence of Logistic Regression is 3 times faster than that of the traditional single-node version in the multi-node framework.
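The core pattern described above, workers pulling shared weights from a parameter server and pushing gradients back asynchronously, can be sketched in a few lines. This is a minimal illustrative sketch in Python with threads standing in for worker nodes; the `ParameterServer` class, the synthetic data, and all hyperparameters are assumptions for illustration, not the paper's actual Ps-Lite (C++) implementation.

```python
import threading
import numpy as np

class ParameterServer:
    """Holds the shared weight vector; workers pull weights and push gradients.
    Illustrative stand-in for a Ps-Lite server node, not the paper's code."""
    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr
        self.lock = threading.Lock()

    def pull(self):
        with self.lock:
            return self.w.copy()

    def push(self, grad):
        # Asynchronous SGD: apply each worker's gradient as it arrives,
        # with no barrier waiting for the other workers.
        with self.lock:
            self.w -= self.lr * grad

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def worker(server, X, y, steps, batch=32, seed=0):
    """One worker node training logistic regression on its local data shard."""
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        idx = rng.integers(0, len(y), size=batch)
        w = server.pull()                       # pull current global weights
        p = sigmoid(X[idx] @ w)
        grad = X[idx].T @ (p - y[idx]) / batch  # logistic-loss gradient
        server.push(grad)                       # push gradient to the server

# Synthetic linearly separable data, partitioned across 4 workers.
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 5))
w_true = np.array([1.5, -2.0, 1.0, 0.5, -1.0])
y = (X @ w_true > 0).astype(float)

server = ParameterServer(dim=5)
shards = np.array_split(np.arange(2000), 4)
threads = [threading.Thread(target=worker,
                            args=(server, X[s], y[s], 200, 32, i))
           for i, s in enumerate(shards)]
for t in threads:
    t.start()
for t in threads:
    t.join()

acc = np.mean((sigmoid(X @ server.pull()) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

In a real Ps-Lite deployment, the pull/push calls become network operations against server nodes that hold key-value shards of the weights, but the asynchronous update logic is the same.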