2020
DOI: 10.1109/access.2020.3022672
Contrastive Self-Supervised Hashing With Dual Pseudo Agreement

Abstract: Recently, unsupervised deep hashing has attracted increasing attention, mainly because of its potential ability to learn binary codes without identity annotations. However, because the labels are predicted by their pretext tasks, unsupervised deep hashing becomes unstable when learning with noisy labels. To mitigate this issue, we propose a simple but effective approach to self-supervised hash learning based on dual pseudo agreement. By adding a consistency constraint, our method can prevent corrupted labels and encourage generalization for effective knowledge distillation.
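The dual pseudo agreement idea described in the abstract — trusting a pseudo-label only when independent views of the same input agree on it — can be sketched as follows. This is a generic illustration of the agreement filter, not the paper's exact formulation; the function name, the two-view setup, and the argmax-based cluster labels are all assumptions:

```python
import numpy as np

def agreed_pseudo_labels(logits_a, logits_b):
    """Keep pseudo-labels only where two views of the same inputs agree.

    logits_a, logits_b: (batch, num_clusters) scores produced by the
    pretext task on two augmented views of the same images.
    Returns the agreed labels and a boolean mask over the batch.
    """
    labels_a = np.argmax(logits_a, axis=1)
    labels_b = np.argmax(logits_b, axis=1)
    mask = labels_a == labels_b  # "dual agreement": both views concur
    return labels_a[mask], mask

# Toy usage: three samples, two clusters; only the first sample's
# pseudo-label survives, since the views disagree on the other two.
view_a = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
view_b = np.array([[0.8, 0.2], [0.7, 0.3], [0.1, 0.9]])
labels, mask = agreed_pseudo_labels(view_a, view_b)
```

Filtering out disagreeing samples is one simple way to realize the consistency constraint: corrupted pseudo-labels tend to be unstable across augmentations and are thus excluded from hash learning.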

Cited by 7 publications (4 citation statements)
References 45 publications
“…The NRDH method is compared with several popular hash learning algorithms: FPH [16], HashNet [18], NSPH [26], CSH [27], DOH [28], and RODH [29]. Table 1 compares the MAP of our DNRH algorithm and existing hash learning algorithms on the CIFAR-10 dataset.…”
Section: B. Experimental Results
confidence: 99%
“…Quantization loss reduces the error caused by binarizing real-valued feature representations to hash codes, and bit balance loss weakens hash code bias. CSH [27] proposed a simple but effective approach to self-supervised hash learning based on dual pseudo-agreement. By adding a consistency constraint, this method can prevent corrupted labels and encourage generalization for effective knowledge distillation.…”
Section: Introduction
confidence: 99%
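The quantization and bit balance losses mentioned in the citation above are standard objectives in deep hashing and can be sketched numerically. This is a minimal illustration of the common formulations, not code from CSH; the function names and the 0.1 weighting are illustrative:

```python
import numpy as np

def quantization_loss(h):
    # Penalize the gap between continuous codes h (in (-1, 1)) and
    # their binarized versions sign(h); a small loss means the final
    # sign() binarization discards little information.
    b = np.sign(h)
    return np.mean((h - b) ** 2)

def bit_balance_loss(h):
    # Encourage each bit to take +1 and -1 equally often across the
    # batch (zero mean per bit), which weakens hash-code bias.
    return np.sum(np.mean(h, axis=0) ** 2)

# Toy usage: a batch of 8 continuous 16-bit codes from some encoder.
rng = np.random.default_rng(0)
h = np.tanh(rng.standard_normal((8, 16)))
loss = quantization_loss(h) + 0.1 * bit_balance_loss(h)
```

Both terms are typically added to the main retrieval objective; the quantization term pulls code entries toward ±1, while the balance term spreads information evenly across bits.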
“…However, these models did not make full use of the pair information. While contrastive learning [8,16,31] aims to leverage pair information, the success of contrastive hashing in tasks like image retrieval [11,15,19] inspired numerous researchers to investigate its application in cross-modal hashing.…”
Section: Related Work
confidence: 99%
“…With the rise of big data, deep learning and computer vision have developed rapidly, since models trained on big data can be trained more effectively and accurately. Deep learning, computer vision, and big data analysis are techniques that use existing computer system models to convert large amounts of data acquired by computers into useful information [1][2][3]. The larger the scale of the data used, the better the training effect of deep learning and computer vision: recognition becomes more accurate and covers more content, and overfitting and underfitting are reduced.…”
Section: Introduction
confidence: 99%