The elastic net is widely used in high-dimensional statistics for regression and variable selection, and its performance has been shown to often surpass that of the lasso. However, it handles only data contaminated by Gaussian noise, which makes it unsuitable for modern, complex high-dimensional data. Fortunately, an adaptive and robust minimization model, which combines an $\ell_p$-norm data fidelity term with elastic net regularization, has been proposed to deal with different types of noise while inheriting the elastic net's advantages in prediction accuracy. The double non-smoothness of the objective function makes the model challenging to minimize. Our investigation shows that the available optimization algorithms are currently limited to the first-order alternating direction method of multipliers (ADMM), which yields relatively low accuracy in the recovered solutions and relatively slow computation. We are therefore committed to developing a fast and effective algorithm based on second-order information. Specifically, we propose a preconditioned proximal point algorithm (abbreviated as P-PPA) that solves the considered model by adding a proximal term. In theory, we analyze the consistency between the solutions of the surrogate model and the original model. In addition, a key subproblem in P-PPA is solved from the dual perspective by semismooth Newton methods with superlinear or even quadratic convergence. Finally, extensive numerical experiments on high-dimensional simulated and real examples verify that the proposed algorithm is superior to ADMM in terms of both calculation accuracy and speed.

Keywords. Alternating direction method of multipliers; High-dimensional sparse linear regression; $\ell_p$-$\ell_{1\text{-}2}$ minimization; Preconditioned proximal point algorithm; Semismooth Newton method.
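For concreteness, a minimal sketch of the model class the abstract describes, in notation of our own choosing (the weights $\lambda_1$, $\lambda_2$, the range of $p$, and the preconditioner $\mathcal{M}$ are illustrative assumptions, not necessarily the paper's exact formulation). The $\ell_p$-elastic-net estimator takes the form
\[
\min_{x \in \mathbb{R}^n} \; \frac{1}{p}\|Ax - b\|_p^p \;+\; \lambda_1 \|x\|_1 \;+\; \frac{\lambda_2}{2}\|x\|_2^2,
\]
where $A \in \mathbb{R}^{m \times n}$, $b \in \mathbb{R}^m$, $\lambda_1, \lambda_2 > 0$, and $p \in [1, 2]$ adapts the fidelity term to the noise type ($p = 2$ for Gaussian noise, smaller $p$ for heavier-tailed noise). A preconditioned proximal point scheme of the kind named above would then generate iterates by approximately solving a proximally regularized surrogate at each step,
\[
x^{k+1} \approx \operatorname*{arg\,min}_{x \in \mathbb{R}^n} \; \frac{1}{p}\|Ax - b\|_p^p + \lambda_1\|x\|_1 + \frac{\lambda_2}{2}\|x\|_2^2 + \frac{1}{2}\|x - x^k\|_{\mathcal{M}}^2,
\]
where $\mathcal{M}$ is a positive (semi)definite preconditioning operator and $\|z\|_{\mathcal{M}}^2 = \langle z, \mathcal{M}z \rangle$. The added quadratic term makes each subproblem strongly convex, which is what permits a superlinearly convergent semismooth Newton method on its dual.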