2020
DOI: 10.1609/aaai.v34i07.6773

SSAH: Semi-Supervised Adversarial Deep Hashing with Self-Paced Hard Sample Generation

Abstract: Deep hashing methods have proven to be effective and efficient for large-scale Web media search. The success of these data-driven methods largely depends on collecting sufficient labeled data, which is usually a crucial limitation in practical cases. The current solutions to this issue utilize Generative Adversarial Networks (GANs) to augment data in semi-supervised learning. However, existing GAN-based methods treat image generation and hashing learning as two isolated processes, leading to generation ine…
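For orientation, the following is a minimal sketch of the generic pairwise deep-hashing setup that GAN-augmented semi-supervised methods such as the one abstracted above build on. The module names, bit width, and loss form are illustrative assumptions, not the SSAH architecture itself.

```python
# Generic pairwise deep-hashing sketch (illustrative assumptions, not SSAH).
import torch
import torch.nn as nn

class HashHead(nn.Module):
    """Maps backbone features to K relaxed codes in (-1, 1); sign() gives the bits."""
    def __init__(self, feat_dim: int = 512, n_bits: int = 48):
        super().__init__()
        self.fc = nn.Linear(feat_dim, n_bits)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.fc(feats))            # relaxed codes in (-1, 1)

def pairwise_hash_loss(codes: torch.Tensor, sim: torch.Tensor) -> torch.Tensor:
    """Pull codes of similar pairs together, push dissimilar pairs apart.

    codes: (B, K) relaxed hash codes; sim: (B, B) with 1 for similar pairs, 0 otherwise.
    """
    K = codes.size(1)
    inner = codes @ codes.t() / K                    # normalized inner product in [-1, 1]
    target = 2.0 * sim - 1.0                         # map {0, 1} -> {-1, +1}
    return ((inner - target) ** 2).mean()

# Usage with dummy features and labels from any image encoder.
feats = torch.randn(8, 512)
labels = torch.randint(0, 10, (8,))
sim = (labels[:, None] == labels[None, :]).float()
head = HashHead()
loss = pairwise_hash_loss(head(feats), sim)
loss.backward()
```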

Cited by 29 publications (6 citation statements)
References 23 publications
“…It uses labeled data for empirical error minimization and both labeled and unlabeled data for embedding error minimization. The generative adversarial learning approach was also utilized in semi-supervised deep image retrieval [60,142]. A teacher-student semi-supervised image retrieval method was presented in [172], where the pairwise information learned by the teacher network is used as the guidance to train the student network.…”
Section: B) Other Forms of Supervision (mentioning)
confidence: 99%
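The teacher-student idea quoted above can be read as pairwise-similarity distillation: the teacher's similarities over unlabeled images serve as soft targets for the student's codes. The sketch below is an assumption-driven illustration of that reading, not the exact method of [172]; all names are hypothetical.

```python
# Hedged sketch: distill the teacher's pairwise similarities into student hash codes.
import torch
import torch.nn.functional as F

def pairwise_distillation_loss(student_codes: torch.Tensor,
                               teacher_feats: torch.Tensor) -> torch.Tensor:
    """student_codes: (B, K) relaxed hash codes; teacher_feats: (B, D) frozen features."""
    with torch.no_grad():
        t = F.normalize(teacher_feats, dim=1)
        teacher_sim = t @ t.t()                      # cosine similarities in [-1, 1]
    s = F.normalize(student_codes, dim=1)
    student_sim = s @ s.t()
    return F.mse_loss(student_sim, teacher_sim)

# Usage with dummy tensors standing in for student codes and teacher features.
student_codes = torch.tanh(torch.randn(8, 48, requires_grad=True))
teacher_feats = torch.randn(8, 512)
loss = pairwise_distillation_loss(student_codes, teacher_feats)
loss.backward()
```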
“…In label-insufficient scenarios, deep hashing is designed for exploiting unlabeled or weakly labeled data, e.g. semi-supervised hashing (Yan, Zhang, and Li 2017; Jin et al 2020), unsupervised hashing (Shen et al 2018; Yang et al 2019), and weakly-supervised hashing (Li et al 2020; Gattupalli, Zhuo, and Li 2019). Moreover, building upon the merit of deep learning, hashing technique has also been applied in more challenging tasks, such as video retrieval (Gu, Ma, and Yang 2016) and cross-modal retrieval (Jiang and Li 2017).…”
Section: Deep Hashing Based Similarity Retrieval (mentioning)
confidence: 99%
“…A recent milestone of learning to hash is the integration of hashing and deep learning. There have been many breakthroughs, both on the deep binary hashing [8,9] and deep quantization [10,11,12,13]. Despite the excitement to witness the harvest of deep hashing models that keep refreshing state of the art, we find that most of current models still relied on high-quality supervision.…”
Section: Introduction (mentioning)
confidence: 98%
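The deep binary hashing that this last statement refers to is attractive at retrieval time because codes can be packed into machine words and ranked by Hamming distance. A minimal sketch under assumed bit width and database size follows; it illustrates the retrieval step only, not any particular cited model.

```python
# Illustrative Hamming-distance retrieval over packed binary hash codes.
import numpy as np

def pack_codes(relaxed: np.ndarray) -> np.ndarray:
    """Binarize (N, K) relaxed codes by sign and pack each row into uint8 bytes."""
    bits = (relaxed > 0).astype(np.uint8)
    return np.packbits(bits, axis=1)

def hamming_rank(query: np.ndarray, database: np.ndarray) -> np.ndarray:
    """Return database indices sorted by Hamming distance to the query code."""
    xor = np.bitwise_xor(database, query)            # (N, K/8) differing-bit bytes
    dist = np.unpackbits(xor, axis=1).sum(axis=1)    # popcount per row
    return np.argsort(dist)

# Usage with random 64-bit codes standing in for a hashed image database.
db = pack_codes(np.random.randn(10000, 64))
q = pack_codes(np.random.randn(1, 64))
top10 = hamming_rank(q, db)[:10]
```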