The KNN algorithm suffers from rapidly growing time requirements and degraded classification performance on data sets containing a large number of samples. To address these drawbacks, this paper proposes a sample reduction method based on classification contribution ranking (SRCCR), which not only greatly reduces storage requirements and execution time, but also significantly improves the classification performance of KNN. First, SRCCR denoises the initial training set, removing noise points and thereby smoothing the decision boundary to a certain extent. Second, the denoised samples are sorted in ascending order of a classification contribution score: the lower the score, the closer the sample is to the class boundary; the higher the score, the closer it is to the center of its own class. Finally, a local-set-based selection process chooses representative boundary samples and a small number of central samples to form the final subset. Although boundary samples contribute the most to classification accuracy, retaining an appropriate number of central samples further improves the classification effect. To verify the effectiveness of the proposed method, we conduct comparative experiments on 24 real data sets from the UCI and KEEL repositories. Compared with several classical instance selection algorithms, SRCCR shows advantages in both accuracy and reduction rate. Results on the two-dimensional `Banana' data set show that SRCCR not only selects representative boundary points and central points, but also largely preserves the distribution of the original data set.
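The three-stage pipeline described above (denoise, rank by classification contribution, then select boundary plus central samples) can be sketched as follows. This is a minimal illustration, not the paper's exact method: the ENN-style denoising rule, the particular contribution score (distance to the nearest enemy minus distance to the nearest same-class sample), and the selection fractions are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of an SRCCR-style pipeline; the scoring rule and
# parameter names are illustrative assumptions, not the paper's definitions.
import math
import random


def knn_label(train, query, k=3):
    # Majority label among the k nearest training samples.
    neigh = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    labels = [lab for _, lab in neigh]
    return max(set(labels), key=labels.count)


def denoise(train, k=3):
    # ENN-style editing: drop samples misclassified by their neighbours,
    # which removes noise points and smooths the decision boundary.
    kept = []
    for i, (x, y) in enumerate(train):
        rest = train[:i] + train[i + 1:]
        if knn_label(rest, x, k) == y:
            kept.append((x, y))
    return kept


def contribution(sample, train):
    # Illustrative contribution score: distance to the nearest enemy
    # (other-class sample) minus distance to the nearest friend.
    # Low scores -> near the class boundary; high scores -> near the
    # class center, matching the ordering described in the abstract.
    x, y = sample
    friends = [math.dist(x, xo) for xo, yo in train if yo == y and xo != x]
    enemies = [math.dist(x, xo) for xo, yo in train if yo != y]
    return min(enemies) - min(friends)


def srccr_subset(train, boundary_frac=0.3, center_frac=0.1, k=3):
    # Stage 1: denoise; stage 2: rank ascending by contribution;
    # stage 3: keep mostly boundary samples plus a few central ones.
    clean = denoise(train, k)
    ranked = sorted(clean, key=lambda s: contribution(s, clean))
    n_boundary = max(1, int(boundary_frac * len(ranked)))
    n_center = max(1, int(center_frac * len(ranked)))
    return ranked[:n_boundary] + ranked[-n_center:]


# Usage on synthetic two-class 2-D data (stand-in for a real data set).
random.seed(0)
train = (
    [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(60)]
    + [((random.gauss(3, 1), random.gauss(3, 1)), 1) for _ in range(60)]
)
subset = srccr_subset(train)
print(len(subset), "of", len(train), "samples retained")
```

Only the reduced subset is then used as the reference set for KNN classification, which is what yields the storage and run-time savings the abstract claims.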