Automatic classification of cervical Pap smear images plays a key role in computer-aided cervical cancer diagnosis. Conventional classification approaches rely on cell segmentation and hand-crafted feature extraction. Owing to overlapping cells, dust, impurities, and uneven illumination, accurate segmentation and feature extraction from Pap smear images remain challenging. To overcome the limitations of feature-based approaches, deep learning has become an important alternative. Because the number of cervical cytological images is limited, an adaptive pruning deep transfer learning model (PsiNet-TAP) is proposed for Pap smear image classification. We designed a novel network to classify Pap smear images. Given the limited number of images, transfer learning was adopted to obtain a pre-trained model, which was then optimized by modifying the convolution layers and pruning convolution kernels that may interfere with the target classification task. The proposed PsiNet-TAP was tested on 389 cervical Pap smear images and achieved remarkable performance (accuracy: more than 98%), demonstrating its strength as an efficient tool for cervical cancer classification in clinical settings.

INDEX TERMS Adaptive pruning, cervical smear images, convolutional neural networks, transfer learning, uninvolved images.
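The kernel-pruning step described in the abstract can be illustrated with a minimal sketch. The abstract does not specify the paper's adaptive importance criterion, so this example assumes a common stand-in, ranking convolution kernels by their L1 norm and discarding the weakest ones; the function name and `keep_ratio` parameter are hypothetical, not from the paper.

```python
import numpy as np

def prune_conv_kernels(weights, keep_ratio=0.75):
    """Rank convolution kernels by L1 norm and keep the strongest ones.

    weights: array of shape (out_channels, in_channels, kh, kw).
    Returns the pruned weight tensor and the kept kernel indices.

    Illustrative sketch only: the L1-norm criterion is an assumption
    standing in for the adaptive pruning criterion of PsiNet-TAP.
    """
    # Importance score per output kernel: sum of absolute weights
    norms = np.abs(weights).sum(axis=(1, 2, 3))
    n_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    # Indices of the n_keep strongest kernels, in original order
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return weights[keep], keep

# Example: a layer with 8 kernels, pruned down to 6
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
pruned, kept = prune_conv_kernels(w, keep_ratio=0.75)
print(pruned.shape)  # → (6, 3, 3, 3)
```

In a transfer-learning setting, such pruning would be applied to a pre-trained backbone before fine-tuning, so that kernels irrelevant to the target Pap smear classes do not interfere with the new task.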