2020
DOI: 10.48550/arxiv.2008.03030
Preprint

Deep Robust Clustering by Contrastive Learning

Abstract: Recently, many unsupervised deep learning methods have been proposed to learn clustering from unlabelled data. By introducing data augmentation, most of the latest methods approach deep clustering from the perspective that an original image and its transformation should share a similar semantic clustering assignment. However, the representation features can be quite different even when two samples are assigned to the same cluster, since the softmax function is only sensitive to the maximum value. This may result in high intr…
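The abstract's point that softmax is only sensitive to the maximum value can be illustrated with a small sketch (Python/NumPy; the logits and feature vectors below are invented for illustration): two views receive near-identical soft cluster assignments while their representation features stay dissimilar.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Invented cluster-head logits for an image and its augmentation: both give
# the same argmax and near-identical soft assignments, because softmax only
# reacts to the dominant logit.
logits_a = np.array([5.0, 1.0, 0.5])
logits_b = np.array([9.0, 1.2, 0.7])
print(softmax(logits_a))   # ~[0.97, 0.02, 0.01]
print(softmax(logits_b))   # ~[1.00, 0.00, 0.00]

# Invented representation features for the same two views: assigned to the
# same cluster above, yet far apart in feature space.
feat_a = np.array([1.0, 0.0, 0.0, 0.0])
feat_b = np.array([0.3, 0.9, 0.3, 0.0])
cos = feat_a @ feat_b / (np.linalg.norm(feat_a) * np.linalg.norm(feat_b))
print(cos)                 # ~0.30 -> high intra-class variance in features
```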

Cited by 15 publications (22 citation statements)
References 21 publications
“…According to the difference in self-supervised signals, deep clustering methods can be mainly divided into two categories: reconstruction-based methods [37,28,8,11,38] and self-augmentation-based methods [3,35,17,12,16,33,42].…”
Section: Related Work 2.1 Deep Clustering
confidence: 99%
“…PICA [16] learns the most semantically plausible clustering solution by maximizing partition confidence. DRC [42] tries to learn invariant features and clusters simultaneously by introducing contrastive learning to optimize the consistency between an image and its augmentation. SCAN [33] uses a three-stage method to improve clustering.…”
Section: Related Work 2.1 Deep Clustering
confidence: 99%
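The image/augmentation consistency objective referenced above is not reproduced on this page; a minimal NT-Xent-style sketch of enforcing agreement between an image batch and its augmented views, assuming PyTorch (the helper name `nt_xent` and temperature `tau` are our assumptions, not the authors' implementation), could look like this:

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    """z1, z2: (N, D) features of a batch and its augmented views.
    Row i of z1 and row i of z2 form the positive pair; the other
    2N - 2 rows in the combined batch serve as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)         # (2N, D)
    sim = z @ z.t() / tau                  # cosine similarities / temperature
    sim.fill_diagonal_(float('-inf'))      # a sample is not its own negative
    n = z1.size(0)
    # the positive of row i is i + n in the first half, i - n in the second
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)
```

Minimizing this loss pulls each image toward its own augmentation and pushes it away from every other sample in the batch, which is one way to realize the invariance the citing paper describes.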
“…Several loss functions have been proposed, such as the triplet loss [Chopra et al., 2005], the noise contrastive estimation (NCE) loss [Gutmann and Hyvärinen, 2010], and the normalized temperature-scaled cross-entropy loss (NT-Xent). Deep robust clustering turns maximizing mutual information into minimizing a contrastive loss, and achieves a significant improvement by applying contrastive learning to decrease intra-class variance [Zhong et al., 2020]. Contrastive clustering develops a dual contrastive learning framework, which conducts contrastive learning at the instance level as well as the cluster level [Li et al., 2021].…”
Section: Contrastive Clustering
confidence: 99%
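The dual instance- and cluster-level scheme described above can be sketched under similar assumptions (PyTorch; `pair_contrast` and `dual_contrastive_loss` are illustrative names, not Li et al.'s code): the instance-level term pairs rows of the two feature matrices, while the cluster-level term pairs columns of the two (N, K) soft-assignment matrices, i.e. each cluster's activation pattern over the batch across the two views.

```python
import torch
import torch.nn.functional as F

def pair_contrast(a, b, tau=0.5):
    """NT-Xent-like loss (one direction, for brevity): row i of `a` and
    row i of `b` are positives; every other row of `b` is a negative."""
    a, b = F.normalize(a, dim=1), F.normalize(b, dim=1)
    logits = a @ b.t() / tau              # (M, M) cosine similarities
    targets = torch.arange(a.size(0))     # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

def dual_contrastive_loss(z1, z2, p1, p2, tau=0.5):
    """z1, z2: (N, D) features; p1, p2: (N, K) soft cluster assignments."""
    instance = pair_contrast(z1, z2, tau)           # match samples
    cluster = pair_contrast(p1.t(), p2.t(), tau)    # match cluster columns
    return instance + cluster

# Usage with random stand-in tensors: batch of 8, 32-dim features, 3 clusters.
z1, z2 = torch.randn(8, 32), torch.randn(8, 32)
p1 = torch.softmax(torch.randn(8, 3), dim=1)
p2 = torch.softmax(torch.randn(8, 3), dim=1)
print(dual_contrastive_loss(z1, z2, p1, p2))
```

Transposing the assignment matrices is what moves the contrast from samples to clusters: after the transpose, each "row" being contrasted is one cluster's response over the whole batch, so agreement between views is enforced per cluster rather than per sample.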