In this work, we present an approach based on multilevel local and global Convolutional Neural Network (CNN) feature matching to retrieve near-duplicate images. CNN features are well suited to visual matching; however, CNN features computed over the entire image alone may not yield accurate retrieval when images undergo various editing or capturing operations. Our retrieval approach therefore matches image pairs at both local and global levels. For local matching, an image is segmented into fixed-size blocks, and patches are then extracted by considering neighboring regions at different levels. Matching local image patches at different levels provides robustness to our retrieval model. During local patch extraction, we select only blocks containing SURF feature points rather than all blocks. CNN features are extracted and stored for each image patch, followed by extraction of global CNN features for the whole image. Finally, the similarity between image pairs is computed from all extracted CNN features, using a similarity function based on correlation and the number of matched blocks. We evaluated the proposed approach on the benchmark Holiday dataset, and retrieval results show a substantial improvement in mean average precision (mAP).
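To make the matching pipeline concrete, the sketch below shows one plausible way to combine local patch features and global features into a pair similarity score, assuming the SURF-selected patch features and the global CNN feature have already been extracted. The choice of Pearson correlation, the matching threshold tau, the equal local/global weighting, and all function names are illustrative assumptions rather than the exact similarity function used in this work.

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two flattened CNN feature vectors."""
    a = np.asarray(a, dtype=float).ravel()
    b = np.asarray(b, dtype=float).ravel()
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float(np.mean(a * b))

def pair_similarity(query_patches, db_patches, query_global, db_global,
                    tau=0.5, w=0.5):
    """
    Score an image pair from local patch features and a global feature.
    query_patches / db_patches: lists of CNN feature vectors, one per
    SURF-selected block patch.  A query patch counts as "matched" when its
    best correlation with any database patch exceeds tau (assumed value).
    """
    matched, corr_sum = 0, 0.0
    for fq in query_patches:
        best = max((correlation(fq, fd) for fd in db_patches), default=0.0)
        if best > tau:
            matched += 1
            corr_sum += best
    # Local score combines the average correlation of matched patches with
    # the fraction of query patches that found a match (block count term).
    local = (corr_sum / matched) * (matched / max(len(query_patches), 1)) if matched else 0.0
    glob = correlation(query_global, db_global)
    return w * local + (1.0 - w) * glob  # equal weighting is an assumption
```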