2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2014)
DOI: 10.1109/CVPR.2014.253
Fast Supervised Hashing with Decision Trees for High-Dimensional Data

Abstract: Supervised hashing aims to map the original features to compact binary codes that are able to preserve label-based similarity in the Hamming space. Non-linear hash functions have demonstrated their advantage over linear ones due to their powerful generalization capability. In the literature, kernel functions are typically used to achieve non-linearity in hashing, which achieve encouraging retrieval performance at the price of slow evaluation and training time. Here we propose to use boosted decision trees for a…
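As a rough illustration of the idea described in the abstract, the sketch below trains one boosted decision-tree ensemble per hash bit to fit a given set of target binary codes, and hashes new points by thresholding the ensembles' predictions. It is a minimal sketch under stated assumptions, not the paper's exact algorithm: scikit-learn's GradientBoostingClassifier stands in for the boosted trees, the target codes are treated as given (the paper's full method also infers them during training), and all data and parameter values are placeholders.

```python
# Minimal sketch: boosted decision trees as hash functions.
# One ensemble is trained per bit to reproduce pre-computed target codes.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def train_tree_hash(X, target_codes, n_trees=50, max_depth=4):
    """X: (n, d) features; target_codes: (n, b) array of 0/1 target bits."""
    models = []
    for bit in range(target_codes.shape[1]):
        clf = GradientBoostingClassifier(n_estimators=n_trees, max_depth=max_depth)
        clf.fit(X, target_codes[:, bit])   # fit this bit's binary code
        models.append(clf)
    return models

def hash_codes(models, X):
    """Map features to compact binary codes (one column per model/bit)."""
    bits = [m.predict(X).astype(np.uint8) for m in models]
    return np.stack(bits, axis=1)

# Placeholder data: random features and random 16-bit target codes.
X_train = np.random.randn(500, 128)
codes = (np.random.randn(500, 16) > 0).astype(np.uint8)
models = train_tree_hash(X_train, codes)
query_codes = hash_codes(models, np.random.randn(10, 128))
```

Retrieval then reduces to comparing `query_codes` with database codes under the Hamming distance, which is the fast lookup step the abstract refers to.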

Cited by 380 publications (304 citation statements). References 20 publications.
“…The basic idea is to exploit deep learning methods to extract deep features automatically, while using the similarity in functional behaviors as the supervised information to guide the deep feature learning process. In addition, we use learning to hash [Kong and Li, 2012] to further transform the real-valued representations into binary hash codes, so as to improve detection efficiency and save storage space. Specifically, we formulate clone detection as a supervised deep feature learning problem with pairwise labels, where clone pairs are regarded as positive examples and non-clone pairs as negative examples.…”
Section: Introduction (mentioning)
confidence: 99%
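The pairwise-label formulation mentioned in this excerpt can be summarized in a few lines. The sketch below is a generic contrastive-style pairwise loss on real-valued embeddings, followed by sign thresholding to obtain binary codes; the function names, the margin value, and the network that produces the embeddings (not shown) are illustrative assumptions, not the cited paper's implementation.

```python
# Sketch of a pairwise-label hashing objective: pull positive (e.g. clone)
# pairs together, push negative pairs apart, then binarize the embeddings.
import numpy as np

def pairwise_hash_loss(emb_a, emb_b, labels, margin=2.0):
    """emb_*: (n, b) real-valued codes; labels: 1 for similar pairs, 0 otherwise."""
    d = np.linalg.norm(emb_a - emb_b, axis=1)            # distance per pair
    pos = labels * d ** 2                                 # similar pairs: small distance
    neg = (1 - labels) * np.maximum(margin - d, 0) ** 2   # dissimilar pairs: at least margin apart
    return np.mean(pos + neg)

def binarize(emb):
    """Relaxation-to-binary step: sign thresholding gives the final hash codes."""
    return (emb > 0).astype(np.uint8)
```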
“…In [Lai et al, 2015], NINH is proposed to improve CNNH by fusing hash learning into the deep structure. NINH learns hash codes by minimizing a triplet ranking loss on top of the "network in network" architecture [Lin et al, 2013]. DSH [Liu et al, 2016] learns hash codes by preserving the similarity encoded in input pairs of images (similar/dissimilar) using a CNN architecture.…”
Section: Deep Hashing Methods (mentioning)
confidence: 99%
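For reference, the triplet ranking loss referred to in this excerpt has the following general form: the (relaxed) hash code of a query should be closer to the code of a similar image than to that of a dissimilar image by at least a margin. The names and margin value below are illustrative; this is a generic sketch, not NINH's exact loss or network.

```python
# Sketch of a triplet ranking loss over relaxed (real-valued) hash codes.
import numpy as np

def triplet_ranking_loss(q, pos, neg, margin=1.0):
    """q, pos, neg: (n, b) relaxed hash codes for query, similar, dissimilar images."""
    d_pos = np.sum((q - pos) ** 2, axis=1)   # distance query -> similar image
    d_neg = np.sum((q - neg) ** 2, axis=1)   # distance query -> dissimilar image
    return np.mean(np.maximum(d_pos - d_neg + margin, 0.0))
```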
“…DSH [Liu et al, 2016] learns hash codes with a CNN architecture by preserving the similarity encoded in input pairs of images (similar/dissimilar). NINH [Lai et al, 2015] learns hash codes by minimizing a triplet ranking loss on top of the "network in network" architecture [Lin et al, 2013]. In [Dong et al, 2016], low-rank hashing first pre-learns hash functions from CNN features and then fine-tunes them with a triplet ranking loss.…”
Section: Introduction (mentioning)
confidence: 99%
“…More specifically, unsupervised hashing does not utilize the label information of training examples [21][22][23][24]. In contrast, supervised hashing methods incorporate semantic (label) information when learning hash functions [25][26][27][28][29][30]. Although hash learning has been successfully applied to natural image retrieval, few studies have been devoted to hash-learning-based RS image retrieval.…”
Section: Introduction (mentioning)
confidence: 99%