2022
DOI: 10.1016/j.ins.2021.11.006

VBLSH: Volume-balancing locality-sensitive hashing algorithm for K-nearest neighbors search

Cited by 5 publications (1 citation statement)
References 25 publications
“…Given a d-dimensional dataset D with n points and a query point q, the kNN problem aims to find the top-k nearest neighbors of q, which is a subset R(q) ⊆ D, s.t., |R(q)| = k and ∀x ∈ R(q), ∀y ∈ D − R(q), dis(x, q) ≤ dis(y, q). Various techniques have been proposed for efficient kNN search, such as tree-based [2,3,4,7,14,24,37], LSH-based [11,12,13,33,35,38,39], quantization-based [18,19,27] approaches. However, retrieving the nearest neighbors from a large-scale and high-dimensional dataset is computationally challenging due to the curse of dimensionality, i.e., existing approaches either suffer from high construction cost or unsatisfactory search performance.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
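To make the quoted definition concrete, below is a minimal Python sketch of exact kNN over a dataset D and query q, together with a toy random-hyperplane LSH filter that illustrates the candidate-pruning idea behind LSH-based approaches such as VBLSH. This is not the paper's algorithm; the function names (knn_exact, lsh_candidates) and parameters (num_hashes, seed) are illustrative assumptions.

```python
# Minimal sketch of the quoted kNN definition plus a toy LSH candidate
# filter. NOT the VBLSH algorithm from the paper; all names and
# parameters are illustrative assumptions.
import numpy as np

def knn_exact(D: np.ndarray, q: np.ndarray, k: int) -> np.ndarray:
    """Exact kNN: indices of the k points in D closest to q, i.e. the set
    R(q) with |R(q)| = k and dis(x, q) <= dis(y, q) for every x in R(q)
    and y in D - R(q)."""
    dists = np.linalg.norm(D - q, axis=1)   # Euclidean dis(., q)
    return np.argsort(dists)[:k]            # indices of the k smallest

def lsh_candidates(D: np.ndarray, q: np.ndarray,
                   num_hashes: int = 8, seed: int = 0) -> np.ndarray:
    """Toy LSH filter: hash points by the sign of random projections and
    return indices whose bit signature matches the query's, as a
    candidate set for a subsequent exact re-ranking pass."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((D.shape[1], num_hashes))
    sig_D = (D @ planes) > 0                # n x num_hashes bit codes
    sig_q = (q @ planes) > 0
    return np.where((sig_D == sig_q).all(axis=1))[0]

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    D = rng.standard_normal((10_000, 64))   # n = 10000 points, d = 64
    q = rng.standard_normal(64)

    exact = knn_exact(D, q, k=5)
    cand = lsh_candidates(D, q)
    # Re-rank only the candidate bucket exactly (the usual LSH recipe).
    approx = cand[knn_exact(D[cand], q, k=min(5, len(cand)))]
    print("exact top-5:", exact)
    print("LSH candidates:", len(cand), "approx top-5:", approx)
```

The exact pass is the definition from the citation statement verbatim; the LSH pass shows why hash buckets matter for large-scale, high-dimensional data: only the matching bucket is re-ranked, trading a small loss in recall for a large reduction in distance computations.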