To limit the influence of outliers in the training data and to obtain sparse solutions, Huang et al. [15] proposed the ramp loss linear programming support vector machine (ramp-LPSVM). The combination of ℓ1 regularization and the ramp loss not only leads to sparsity of the parameters in the decision function, but also caps the penalty incurred by any single outlier. However, because the resulting problem is non-convex, obtaining a satisfactory solution is often computationally expensive. In this paper, we propose a modified coordinate descent algorithm that solves a sequence of one-variable piecewise-linear subproblems. Since each subproblem is a difference-of-convex (DC) program, we linearize the concave part of its objective and solve the resulting convex problem. To evaluate the proposed algorithm, we carry out and analyse numerical experiments on benchmark data sets; to enhance sparsity and robustness, the algorithm is initialized from C-SVM solutions. The results confirm its strong performance in classification accuracy, robustness, and computational efficiency.
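The coordinate-wise DC step can be illustrated with a minimal sketch. The Python code below is not the paper's implementation: it assumes a plain linear model f(x) = ⟨w, x⟩ without a bias term, an ℓ1 penalty on w (rather than the kernel expansion used in ramp-LPSVM), and the ramp loss written as R_s(u) = max(0, 1-u) - max(0, s-u) with s = 0, so the per-sample penalty is capped at 1. In each one-variable subproblem, the concave part is linearized at the current iterate and the resulting convex piecewise-linear surrogate is minimized by scanning its breakpoints.

```python
import numpy as np

def ramp_cd(X, y, C=1.0, s=0.0, n_sweeps=20, w0=None):
    """Illustrative coordinate descent for an l1-regularized ramp-loss
    classifier on a linear model f(x) = <w, x> (no bias, for brevity).

    Ramp loss: R_s(u) = max(0, 1-u) - max(0, s-u), with s < 1.
    Each one-variable subproblem is DC; the concave part is linearized
    at the current iterate, and the resulting convex piecewise-linear
    surrogate is minimized by scanning its breakpoints.
    """
    n, d = X.shape
    w = np.zeros(d) if w0 is None else np.asarray(w0, dtype=float).copy()
    f = X @ w                                   # cached decision values

    for _ in range(n_sweeps):
        for j in range(d):
            a = y * X[:, j]                     # slope of the margin in w_j
            c = y * f - a * w[j]                # margin with w_j's contribution removed
            u0 = c + a * w[j]                   # current margins y_i * f(x_i)

            # Subgradient (in w_j) of the concave part -C * sum(max(0, s - u_i))
            beta = C * np.sum(a[u0 < s])

            # Convex surrogate: phi(t) = |t| + C * sum(max(0, 1 - c - a*t)) + beta*t
            def phi(t):
                return abs(t) + C * np.maximum(0.0, 1.0 - c - a * t).sum() + beta * t

            # Candidate minimizers: breakpoints of the piecewise-linear surrogate
            cand = [0.0] + [(1.0 - c[i]) / a[i] for i in range(n) if a[i] != 0.0]
            t_new = min(cand, key=phi)          # assumes the surrogate is bounded below

            f += X[:, j] * (t_new - w[j])       # incremental update of cached values
            w[j] = t_new
    return w
```

Because the surrogate is convex and piecewise linear, its minimum (when it is bounded below) is attained at a breakpoint, so no line search is needed; the sketch simply evaluates all candidates, which is adequate for illustration but not tuned for efficiency.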