Approximate nearest neighbor (ANN) search is fundamental for fast content-based image retrieval. Vector quantization is key to effective ANN search; to further improve search accuracy, we propose an enhanced accumulative quantization (E-AQ). Building on our previous work, we introduce the idea of the quarter point into accumulative quantization (AQ). Instead of quantizing each vector with its nearest centroid alone, we quantize it with a quarter vector computed from its nearest and second-nearest centroids. This reduces the error produced in codebook training and vector quantization without increasing the number of centroids in each codebook. To evaluate how accurately vectors are approximated by their quantization outputs, we implement an E-AQ-based exhaustive method for ANN search. Experimental results show that our approach achieves Recall@100 of up to 0.996 and 0.776 with eight codebooks of size 256 on the SIFT and GIST datasets, respectively, which is at least 1.6% and 4.9% higher than six other state-of-the-art methods. Moreover, the results show that E-AQ requires fewer codebooks to provide the same ANN search accuracy.
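The sketch below illustrates the quarter-point idea in one residual stage of accumulative encoding. It is a minimal illustration, not the paper's implementation: the interpolation weight `alpha = 0.25` (placing the quarter vector a quarter of the way from the nearest centroid toward the second-nearest one) and the function names are assumptions introduced for clarity, since the abstract only states that the quarter vector is computed from the two closest centroids.

```python
import numpy as np

def quarter_point_quantize(x, codebook, alpha=0.25):
    """Quantize x with a quarter vector derived from its two closest
    centroids, rather than snapping to the nearest centroid alone.

    codebook: (K, d) array of centroids.
    alpha:    assumed interpolation weight (0.25 -> "quarter point").
    Returns the indices of the two centroids and the quarter vector.
    """
    # Distances from x to every centroid in this codebook.
    dists = np.linalg.norm(codebook - x, axis=1)
    nearest, second = np.argsort(dists)[:2]

    # Assumed construction: move a quarter of the way from the nearest
    # centroid toward the second-nearest one.
    q = codebook[nearest] + alpha * (codebook[second] - codebook[nearest])
    return nearest, second, q

def accumulative_encode(x, codebooks):
    """Accumulative quantization: each stage quantizes the residual
    left by the previous stage, and the approximation is the sum of
    the per-stage quarter vectors."""
    residual = x.astype(float).copy()
    approx = np.zeros_like(residual)
    codes = []
    for cb in codebooks:
        i, j, q = quarter_point_quantize(residual, cb)
        codes.append((i, j))
        approx += q
        residual -= q
    return codes, approx
```

In this sketch the quarter vector shortens the residual passed to the next stage compared with using the nearest centroid alone, which is the intuition behind the reduced quantization error; how the second-centroid information is encoded compactly is left to the paper itself.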