2018
DOI: 10.1109/tpami.2017.2666812
Supervised Learning of Semantics-Preserving Hash via Deep Convolutional Neural Networks

Abstract: This paper presents a simple yet effective supervised deep hashing approach that constructs binary hash codes from labeled data for large-scale image search. We assume that the semantic labels are governed by several latent attributes, each of which is either on or off, and that classification relies on these attributes. Based on this assumption, our approach, dubbed supervised semantics-preserving deep hashing (SSDH), constructs hash functions as a latent layer in a deep network, and the binary codes are learned by minimiz…
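The abstract describes binary codes produced by a latent layer in the network. A minimal numpy sketch of that idea, under the common assumption that the latent layer uses sigmoid activations thresholded at 0.5 (the variable names and sizes here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latent-layer pre-activations for 4 images and an 8-bit code.
z = rng.normal(size=(4, 8))

# Sigmoid activations of the latent hash layer.
h = 1.0 / (1.0 + np.exp(-z))

# Binarizing at 0.5 yields the hash codes used for retrieval.
codes = (h >= 0.5).astype(np.uint8)
```

At query time, codes like these would be compared by Hamming distance, which is what makes the scheme fast at scale.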


Cited by 283 publications (201 citation statements)
References 46 publications
“…Through optimising the sum of reconstruction loss, independence constraint loss, and balance constraint loss, BDNN achieves a small improvement over DH. The recent state-of-the-art method is Supervised Semantics-preserving Deep Hashing (SSDH) [26], which assumes that the semantic labels are governed by several latent attributes. Based on this assumption, SSDH constructs the hash function as a latent layer in a deep network and learns the binary codes by minimising an objective function combining classification error with independence and balance properties like those in BDNN.…”
Section: Deep Hashing Methods
Mentioning confidence: 99%
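The objective described here combines classification error with regularizers on the hash-layer activations. A hedged numpy sketch of such a combined loss, assuming sigmoid activations, a binarization term that pushes activations away from 0.5, and a balance term that keeps each bit firing about half the time (the weights 0.1 and all array shapes are illustrative assumptions, not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sigmoid activations of the latent hash layer: batch of 16, 12 bits.
h = rng.uniform(size=(16, 12))
# Hypothetical class probabilities from the classifier head, and true labels.
probs = rng.dirichlet(np.ones(10), size=16)
labels = rng.integers(0, 10, size=16)

# Classification error: cross-entropy of the classifier built on the hash layer.
ce = -np.mean(np.log(probs[np.arange(16), labels]))

# Binarization term: reward distance from 0.5 (entered with a minus sign, so
# minimizing the total loss drives activations toward 0 or 1).
binarize = -np.mean((h - 0.5) ** 2)

# Balance term: each bit's mean activation should stay near 0.5.
balance = np.mean((h.mean(axis=0) - 0.5) ** 2)

# Weighted sum with illustrative weights.
loss = ce + 0.1 * binarize + 0.1 * balance
```

Minimizing a sum like this lets one network learn codes that are simultaneously discriminative, nearly binary, and balanced.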
“…To solve the problems of NP-hard optimisation and computational complexity in graph-based methods, Asymmetric Discrete Graph Hashing (ADGH) [49] was proposed, preserving the asymmetric discrete constraint and building an asymmetric affinity matrix to learn compact binary codes. Additionally, SSDH [26] constructs hash functions as a latent layer in a deep network, and the binary codes are learned by minimising an objective function defined over classification error.…”
Section: Related Work
Mentioning confidence: 99%
“…In order to push the final real activations towards the extremities of the sigmoid range, and inspired by [9], we use a second loss whose goal is to maximize the sum of the squared errors between the output-layer activations and the value 0.5:…”
Section: Metric-Learning Based Deep Hashing Network
Mentioning confidence: 99%
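The quoted loss maximizes the sum of squared errors between the output-layer activations and 0.5, pushing sigmoid outputs toward their extremes. A minimal sketch of that term, written as a quantity to minimize (the array shape is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sigmoid output-layer activations: batch of 8, 16 bits.
a = rng.uniform(size=(8, 16))

# Negative sum of squared distances from 0.5: minimizing this maximizes the
# distances, so gradient descent drives each activation toward 0 or 1.
push_loss = -np.sum((a - 0.5) ** 2)
```

The loss reaches its minimum exactly when every activation sits at a sigmoid extreme, which makes the subsequent binarization nearly lossless.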
“…Supervised Discrete Hashing (SDH) [18] improves retrieval accuracy by integrating classification and binary code generation during training. Recently, many deep hashing methods have been proposed [3,6,10,29,[32]-[34]]. According to the form of similarity-preserving supervision, two kinds of supervised information are widely used: 1) pairwise-based methods and 2) triplet-based methods.…”
Section: Related Work
Mentioning confidence: 99%
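The pairwise and triplet supervision forms mentioned above can be sketched as two small loss functions over real-valued embeddings, here in contrastive and margin-based forms respectively (the function names, margins, and embedding sizes are illustrative assumptions, not taken from any one of the cited methods):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical real-valued embeddings: anchor, positive, negative.
f = rng.normal(size=(3, 8))

def pairwise_loss(u, v, similar, margin=2.0):
    # Pairwise supervision: pull similar pairs together; push dissimilar
    # pairs apart until they are at least `margin` away (contrastive form).
    d = np.linalg.norm(u - v)
    return d ** 2 if similar else max(0.0, margin - d) ** 2

def triplet_loss(anchor, pos, neg, margin=1.0):
    # Triplet supervision: the anchor should be closer to the positive than
    # to the negative by at least `margin`.
    return max(0.0, np.linalg.norm(anchor - pos)
                    - np.linalg.norm(anchor - neg) + margin)

lp = pairwise_loss(f[0], f[1], similar=True)
lt = triplet_loss(f[0], f[1], f[2])
```

Pairwise losses need only binary same/different labels, while triplet losses encode relative similarity, which is why the two families are usually treated separately in deep hashing surveys.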