2016
DOI: 10.1109/tcyb.2015.2474742

Structure Sensitive Hashing With Adaptive Product Quantization

Abstract: Hashing has proven an attractive solution to approximate nearest neighbor search, owing to its theoretical guarantees and computational efficiency. Although most prior hashing algorithms achieve low memory and computation consumption by pursuing compact hash codes, they still fall far short of learning discriminative hash functions from data with complex inherent structure. To address this issue, in this paper we propose a structure sensitive hashing based on…
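For background, the sketch below illustrates the generic hashing pipeline for approximate nearest neighbor search that the abstract refers to: data are mapped to compact binary codes, and neighbors are retrieved by Hamming distance. It uses random-hyperplane LSH as a hypothetical stand-in for a learned hash function; all names and parameters here are illustrative, and this is not the paper's structure sensitive hashing.

```python
# Minimal sketch of hashing-based approximate nearest neighbor search.
# Random-hyperplane LSH stands in for a learned hash function.
import numpy as np

rng = np.random.default_rng(0)

def hash_codes(X, W):
    """Project data onto random hyperplanes and binarize: one bit per plane."""
    return (X @ W > 0).astype(np.uint8)  # shape (n, n_bits)

def hamming_search(query_code, db_codes, k=5):
    """Rank database items by Hamming distance to the query code."""
    dists = np.count_nonzero(db_codes != query_code, axis=1)
    return np.argsort(dists)[:k]

# Toy usage: 1000 points in 64-d, compressed to 16-bit codes.
X = rng.standard_normal((1000, 64))
W = rng.standard_normal((64, 16))   # random hyperplane normals
db = hash_codes(X, W)
q = hash_codes(X[:1], W)
print(hamming_search(q, db))        # index 0 should rank first
```

Compact codes are what give hashing its low memory footprint: the 64-d float vectors above shrink to 16 bits each, and distance computation reduces to cheap bit operations.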

Cited by 69 publications (19 citation statements)
References 44 publications
“…Recently, a considerable amount of literature has grown up around the theme of hashing [12,13,32,45,46]. According to whether supervised information is involved in the learning phase, existing hashing models can be divided into two categories: supervised hashing methods and unsupervised hashing methods.…”
Section: Related Work
confidence: 99%
“…Unsupervised hashing methods aim to preserve the linkages among the unlabeled training data points. Typical examples include graph-based hashing [10,36,37], quantization-error minimization [38], and reconstruction-error minimization [9,11,33-35]. Supervised methods utilize semantic labels or relevance information to improve the quality of hash codes.…”
Section: Related Work
confidence: 99%
“…Quantization methods that require codeword maintenance: AQ, CQ, SQ, TQ, KMH, ABQ, and SSH [21]-[28]. Product quantization decomposes the data space into a Cartesian product of subspaces. The codeword of a data instance is represented by the concatenation of the subcodewords of the data in all subspaces.…”
Section: Related Work
confidence: 99%
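As a concrete illustration of the product quantization scheme this statement describes, here is a minimal sketch assuming plain k-means subcodebooks. The function names and parameters are illustrative assumptions, and this is generic PQ, not the paper's adaptive product quantization.

```python
# Minimal product quantization (PQ) sketch: split dimensions into subspaces,
# learn a k-means codebook per subspace, and encode each data instance as the
# concatenation of its nearest subcodeword index from every subspace.
import numpy as np

def train_pq(X, n_subspaces=4, n_centroids=16, iters=20, seed=0):
    """Run plain k-means independently in each dimension subspace."""
    rng = np.random.default_rng(seed)
    subdims = np.array_split(np.arange(X.shape[1]), n_subspaces)
    codebooks = []
    for dims in subdims:
        sub = X[:, dims]
        C = sub[rng.choice(len(sub), n_centroids, replace=False)]
        for _ in range(iters):
            assign = np.argmin(((sub[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
            for j in range(n_centroids):
                if np.any(assign == j):
                    C[j] = sub[assign == j].mean(axis=0)
        codebooks.append(C)
    return subdims, codebooks

def encode_pq(X, subdims, codebooks):
    """Codeword = concatenation of per-subspace nearest-centroid indices."""
    codes = [np.argmin(((X[:, d][:, None, :] - C[None]) ** 2).sum(-1), axis=1)
             for d, C in zip(subdims, codebooks)]
    return np.stack(codes, axis=1)  # shape (n, n_subspaces)

# Toy usage: 500 points in 32-d, encoded as 4 subcodeword indices each.
X = np.random.default_rng(1).standard_normal((500, 32))
subdims, codebooks = train_pq(X)
print(encode_pq(X[:3], subdims, codebooks))
```

The "codeword maintenance" the statement mentions corresponds to storing the per-subspace codebooks, which every encoded instance references by index.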