Proceedings of the 26th ACM International Conference on Multimedia 2018
DOI: 10.1145/3240508.3240519

Supervised Online Hashing via Hadamard Codebook Learning

Abstract: Online hashing has attracted extensive research attention when facing streaming data. Most online hashing methods, which learn binary codes based on pairwise similarities of training instances, fail to capture the semantic relationship and suffer from poor generalization in large-scale applications due to large variations. In this paper, we propose to model the similarity distributions between the input data and the hashing codes, upon which a novel supervised online hashing method, dubbed Similarity Distrib…
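The abstract contrasts methods that learn binary codes from pairwise similarities of training instances with the proposed distribution-based formulation. As a hedged illustration of the pairwise baseline it refers to (not the paper's method; the linear projection, loss, and toy data below are assumptions for exposition only), a minimal sketch in Python:

    import numpy as np

    def pairwise_hash_loss(W, X, S, code_len=32):
        # W: (d, code_len) projection of a linear hash function h(x) = sign(W^T x)
        # X: (n, d) features of a batch; S: (n, n) pairwise similarity in {-1, +1}
        B = np.sign(X @ W)                    # binary codes from sign of projections
        inner = (B @ B.T) / code_len          # normalized code agreement in [-1, 1]
        return np.mean((inner - S) ** 2)      # push code similarity toward label similarity

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 16))
    y = rng.integers(0, 2, size=8)
    S = np.where(y[:, None] == y[None, :], 1.0, -1.0)
    print(pairwise_hash_loss(rng.normal(size=(16, 32)), X, S))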

Cited by 62 publications (57 citation statements). References 36 publications.
“…We randomly select 5,900 samples from each category as the training set; the remaining images are set as the testing set. From the training set, 20,000 instances are utilized for learning hashing functions [31]. Twenty example images from each category of CIFAR-10 are shown in Figure 2.…”
Section: Methods
confidence: 99%
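For readers reproducing the protocol quoted above, the split it describes (5,900 training images per class, the remainder for testing, and 20,000 training instances used to learn the hash functions) could be sketched as below. The function name and random sampling are illustrative assumptions; the citing paper does not specify its exact sampling procedure here.

    import numpy as np

    def split_cifar10(labels, per_class_train=5900, n_hash_train=20000, seed=0):
        # labels: (N,) array of class ids 0..9 for the full CIFAR-10 collection
        rng = np.random.default_rng(seed)
        train_idx, test_idx = [], []
        for c in range(10):
            idx = rng.permutation(np.where(labels == c)[0])
            train_idx.extend(idx[:per_class_train])   # 5,900 per class for training
            test_idx.extend(idx[per_class_train:])    # remaining images for testing
        train_idx = np.asarray(train_idx)
        # 20,000 instances drawn from the training set to learn the hash functions
        hash_idx = rng.choice(train_idx, size=n_hash_train, replace=False)
        return train_idx, np.asarray(test_idx), hash_idx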
“…Supervised methods obtain better performance than unsupervised methods in most instances because of the utilization of label information. Some representative works include online hashing (OKH) [24,25], adaptive hashing (AdaptHash) [26], online supervised hashing (OSH) [27,28], online hashing with mutual information (MIHash) [29], balanced similarity for online discrete hashing (BSODH) [30], and Hadamard codebook-based online hashing (HCOH) [31].…”
Section: Introduction
confidence: 99%
“…We compare our method, OSSH, with the following online hashing methods: Online Sketching Hashing (OSH) [30], Online Kernel Hashing (OKH) [28], Adaptive Hashing (AdaptH) [27], Online Supervised Hashing (OSupH) [29], Mutual Information Hashing (MIH) [32] with the trigger update module, and Hadamard Codebook based Online Hashing (HCOH) [24]. Implementations of all the methods were provided by their authors.…”
Section: B. Comparison in a Static Environment
confidence: 99%
“…Inspired by online learning methods, online hashing methods [24]-[27] have recently been proposed to learn the hash functions in the online setting by processing the data in sequential order with one pass. According to how they learn the hash functions from the streaming data, online hashing methods can be roughly categorized into stochastic gradient descent (SGD)-based online hashing methods [27]-[29] and data sketching-based online hashing methods [30], [31].…”
Section: Introduction
confidence: 99%
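As a rough sketch of the SGD-based family described in the quote, the loop below performs a single pass over streaming batches and updates a linear hash projection after each batch. It is a generic illustration under assumed choices (tanh relaxation, pairwise similarity-preserving loss), not the specific algorithm of any cited method.

    import numpy as np

    def online_hash_sgd(stream, dim, code_len=32, lr=0.1, seed=0):
        # stream: iterable of (X, S) batches seen exactly once,
        # X: (b, dim) features, S: (b, b) pairwise similarity in {-1, +1}
        W = np.random.default_rng(seed).normal(scale=0.01, size=(dim, code_len))
        for X, S in stream:
            b = X.shape[0]
            H = np.tanh(X @ W)                               # smooth surrogate for sign(X W)
            R = (H @ H.T) / code_len - S                     # similarity-preservation error
            dH = (2.0 / (b * b * code_len)) * ((R + R.T) @ H)
            W -= lr * (X.T @ (dH * (1.0 - H ** 2)))          # chain rule through tanh
        return W

    rng = np.random.default_rng(1)
    def toy_stream(n_batches=50, b=16, dim=8):
        for _ in range(n_batches):
            y = rng.integers(0, 2, size=b)
            X = rng.normal(size=(b, dim)) + y[:, None]       # crude class-dependent features
            yield X, np.where(y[:, None] == y[None, :], 1.0, -1.0)

    W = online_hash_sgd(toy_stream(), dim=8)
    codes = np.sign(np.tanh(rng.normal(size=(4, 8)) @ W))    # codes for new queries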
“…To address the efficiency and effectiveness issues, hashing methods have become a hot research topic. A great number of hashing methods have been proposed to map images into a Hamming space, including traditional hashing methods (Andoni and Indyk 2006; Lin et al. 2018; 2019) and deep hashing methods (Cao et al. 2017; 2018b; Liu et al. 2018; Sheng et al. 2018). Compared with traditional ones, deep hashing methods usually achieve better re… [Figure 1 caption: To obtain the optimal boundary for points with similar hashing codes, we propose a novel self-paced deep adversarial hashing to generate hard samples, as shown in (b).]…”
Section: Introduction
confidence: 99%
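To make the quoted notion of mapping images into a Hamming space concrete, the snippet below binarizes real-valued features into codes and ranks a database by Hamming distance. The mean-thresholding rule and random features are placeholder assumptions, not part of any cited method.

    import numpy as np

    def to_hamming_codes(features):
        # threshold zero-centered features into {0, 1} codes (sign-style binarization)
        return (features - features.mean(axis=0) > 0).astype(np.uint8)

    def hamming_rank(query_code, db_codes):
        # rank database items by Hamming distance (number of disagreeing bits) to the query
        dists = np.count_nonzero(db_codes != query_code, axis=1)
        return np.argsort(dists)

    rng = np.random.default_rng(2)
    db = to_hamming_codes(rng.normal(size=(1000, 64)))   # 64-bit codes for 1,000 items
    print(hamming_rank(db[0], db)[:10])                  # ten nearest items in Hamming space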