2010
DOI: 10.3390/s101211259

Approximate Nearest Neighbor Search by Residual Vector Quantization

Abstract: A recently proposed product quantization method is efficient for large-scale approximate nearest neighbor search; however, its performance on unstructured vectors is limited. This paper introduces residual vector quantization based approaches that are appropriate for unstructured vectors. Database vectors are quantized by a residual vector quantizer, and the reproductions are represented by short codes composed of their quantization indices. The Euclidean distance between a query vector and a database vector is approximated…
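The abstract outlines the core pipeline: quantize each database vector in stages, keep only the per-stage indices as a short code, and approximate distances against the decoded reproduction. Below is a minimal Python (NumPy) sketch of that idea, assuming the stage codebooks have already been trained; the names `encode_rvq`, `decode_rvq`, and `approx_distance` are illustrative, not from the paper.

```python
import numpy as np

def encode_rvq(x, codebooks):
    """Encode x as one index per stage; each stage quantizes the
    residual left over by the previous stages."""
    residual = x.copy()
    code = []
    for C in codebooks:                       # each C has shape (k, D)
        dists = np.sum((C - residual) ** 2, axis=1)
        j = int(np.argmin(dists))             # nearest stage centroid
        code.append(j)
        residual = residual - C[j]            # pass the residual on
    return code

def decode_rvq(code, codebooks):
    """Reproduction = sum of the selected centroids over all stages."""
    return sum(C[j] for C, j in zip(codebooks, code))

def approx_distance(query, code, codebooks):
    """Asymmetric approximation: exact query vs. decoded reproduction."""
    return float(np.sum((query - decode_rvq(code, codebooks)) ** 2))
```

Storing only the index list keeps each database entry to a few bytes per stage while still allowing the distance approximation above.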

Cited by 133 publications (89 citation statements). References 16 publications.
“…6) and GIST1M (Fig. 7) datasets against several methods presented in Section II: Product Quantization (ADC and IVFADC) [1], PQ-RO [19], PQ-RR [19], Cartesian k-means [15], OPQ-P [16], [51], OPQ-NP [16], [51], LOPQ [18], a non-exhaustive adaptation of OPQ [16], called I-OPQ [18], RVQ [52], RVQ-P [19] and RVQ-NP [19].…”
Section: Results on BIGANN: SIFT1M, GIST1M
confidence: 99%
“…RVQ [52] approximates the quantization error by another quantizer instead of discarding it. In this method several stage-quantizers, each one with its corresponding stage-codebook, are connected sequentially.…”
Section: Results on BIGANN: SIFT1M, GIST1M
confidence: 99%
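This sequential connection also shows up at training time: each stage-codebook is fit to the residuals left by the stages before it. A brief sketch of that procedure, assuming k-means per stage via scikit-learn; the name `train_stage_codebooks` and the parameters `n_stages` and `k` are hypothetical, not from the cited work.

```python
import numpy as np
from sklearn.cluster import KMeans

def train_stage_codebooks(X, n_stages=4, k=256, seed=0):
    """Train each stage-codebook on the residuals left by the
    previous stages, so later stages model earlier stages' error."""
    residuals = X.astype(np.float64).copy()   # X: (N, D) training set
    codebooks = []
    for _ in range(n_stages):
        km = KMeans(n_clusters=k, n_init=4, random_state=seed).fit(residuals)
        codebooks.append(km.cluster_centers_)
        # subtract each point's assigned centroid; the leftovers become
        # the training set for the next stage
        residuals = residuals - km.cluster_centers_[km.labels_]
    return codebooks
```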
“…neural codes [5], or aggregation of low- or mid-dimensional local descriptors such as SIFT [6], SURF [7], AKAZE [8], etc., using feature aggregation such as Fisher vectors [9], VLAD [10], residual vectors [11], triangulated embedding [12], etc., that produces a resulting vector of defined length D. The resulting high-dimensional descriptors are collected in a database consisting of N enrolled feature vectors, and the identification system should produce a list L(q) of indices of the enrolled features f^(i) ∈ R^D, 1 ≤ i ≤ N, closest to the probe feature vector q ∈ R^D. Additionally, local descriptors can be stored together with their coordinates within the image.…”
Section: B. State-of-the-Art in Mobile Visual Search and Recognition
confidence: 99%
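For context, the identification step described above has an exact brute-force form that quantization-based methods approximate. A minimal NumPy sketch with illustrative names (`identify`, with `F` standing in for the N×D matrix of enrolled features):

```python
import numpy as np

def identify(q, F, top=10):
    """Return the indices of the `top` enrolled vectors (rows of F,
    shape (N, D)) closest to the probe q in Euclidean distance."""
    d2 = np.sum((F - q) ** 2, axis=1)   # squared distances to all N rows
    return np.argsort(d2)[:top]         # L(q): nearest indices first
```

This exact scan costs O(N·D) per probe, which is what motivates the compressed-code approximations discussed in the surrounding excerpts.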
“…Quantization is usually performed with variants of kMeans clustering [25]. With larger vocabularies, however, quantization based on kMeans clustering becomes more and more costly, as assigning a value to its representative is linear in k. To reduce the high complexity of assigning feature vectors to cluster centers during query evaluation, a variety of non-optimal but fast quantization techniques such as Hierarchical kMeans [17,24], approximate kMeans [21], Product Quantization [11] and Residual Vector Quantization [5] have been proposed. After employing quantization for BoVW-based retrieval, it has also been investigated in the context of kNN search [11], which has since received considerable attention.…”
Section: Related Work
confidence: 99%
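To illustrate the complexity argument: flat kMeans assignment compares against all k centers, whereas a hierarchical (vocabulary-tree style) assignment descends b-way splits and touches only b·depth centers for b^depth leaves. A minimal sketch under an assumed tree layout (nodes as dicts with `centers` and `children` keys); the names and structure are illustrative, not from any cited implementation.

```python
import numpy as np

def assign_flat(x, centers):
    """Flat kMeans assignment: O(k) distance evaluations."""
    return int(np.argmin(np.sum((centers - x) ** 2, axis=1)))

def assign_hierarchical(x, tree):
    """Hierarchical kMeans assignment: pick the nearest child at each
    level, costing O(b * depth) instead of O(b**depth) leaf checks."""
    node = tree   # node: {'centers': (b, D) array, 'children': list or None}
    path = []
    while node is not None:
        j = assign_flat(x, node['centers'])
        path.append(j)
        node = node['children'][j] if node['children'] else None
    return path   # sequence of branch choices identifying a leaf word
```

With b = 10 and depth = 6, for example, the tree reaches a million leaf words while evaluating only 60 distances per assignment, which is the speed/accuracy trade-off the excerpt calls "non-optimal but fast".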