2022
DOI: 10.1155/2022/4364252
Accumulative Quantization for Approximate Nearest Neighbor Search

Abstract: To further improve approximate nearest neighbor (ANN) search performance, an accumulative quantization (AQ) scheme is proposed and applied to effective ANN search. It approximates a vector with the accumulation of several centroids, each of which is selected from a different codebook. To provide an accurate approximation of an input vector, an iterative optimization is designed when training the codebooks to improve their approximation power. In addition, another optimization is introduced into offline vector quantizati…
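The abstract's core idea, approximating a vector by the accumulated sum of one centroid per codebook, can be sketched as follows. This is a minimal illustration only: the codebooks here are random stand-ins, and the greedy residual-based selection rule is an assumption, since the truncated abstract does not specify the paper's exact encoding procedure.

```python
import numpy as np

def aq_encode(x, codebooks):
    """Pick one centroid per codebook so that their sum approximates x.

    codebooks: list of (K, d) arrays. At each stage we greedily choose
    the centroid closest to the remaining residual, then subtract it
    (a common strategy in additive quantizers; the paper's exact
    selection rule may differ).
    """
    codes, residual = [], x.astype(float).copy()
    for C in codebooks:
        k = int(np.argmin(((residual - C) ** 2).sum(axis=1)))
        codes.append(k)
        residual -= C[k]
    return codes

def aq_decode(codes, codebooks):
    # Reconstruction is the accumulation (sum) of the selected centroids,
    # one from each codebook.
    return sum(C[k] for C, k in zip(codebooks, codes))
```

In an ANN setting, search then operates on these compact codes (for instance via asymmetric distance computation against the query) rather than on the raw vectors.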

Cited by 1 publication (5 citation statements) · References 28 publications
“…In our former work [34], after decomposing vectors into partial-vectors of the same dimension, each codebook was trained on partial-vectors. Then, we used accumulative quantization (AQ) to approximate a vector according to the sum of its partial vectors, which we quantized with the corresponding codebook.…”
Section: Accumulative Quantization
confidence: 99%
“…Figure 5 shows the comparison of training error produced by training codebooks between E-AQ and AQ [34] under different parameters on SIFT-1M and GIST-1M. The training error was measured by the MSE, computed according to Formula (5).…”
Section: Comparison Of Training Error And Quantization Error
confidence: 99%
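Formula (5) itself is not reproduced in this excerpt, but a training error measured by MSE between vectors and their quantized reconstructions conventionally takes the following form (an assumed standard definition, not necessarily the paper's exact formula):

```python
import numpy as np

def mse(X, X_hat):
    # Mean squared quantization error over a dataset:
    # (1/N) * sum_i ||x_i - x_hat_i||^2, where x_hat_i is the
    # reconstruction of x_i from its quantization codes.
    return float(np.mean(np.sum((X - X_hat) ** 2, axis=1)))
```

Lower MSE on the training set indicates codebooks with stronger approximation power, which is the quantity being compared between E-AQ and AQ above.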