2019 IEEE/CVF International Conference on Computer Vision (ICCV) 2019
DOI: 10.1109/iccv.2019.00824
Deep Comprehensive Correlation Mining for Image Clustering

Abstract: Recently developed deep unsupervised methods make it possible to jointly learn representations and cluster unlabelled data. These deep clustering methods mainly focus on the correlation among samples, e.g., selecting high-precision pairs to gradually tune the feature representation, which neglects other useful correlations. In this paper, we propose a novel clustering framework, named deep comprehensive correlation mining (DCCM), for exploring and taking full advantage of various kinds of correlations behind the unlabele…

Cited by 160 publications (140 citation statements)
References 36 publications
“…(1) CIFAR-10(/100) [28]: A natural image dataset with 50,000/10,000 samples from 10(/100) classes for training and testing respectively. We adopted the same clustering setup as [24,44,7]: using both the training and test sets (without labels) for CIFAR-10/100 and STL-10, and only the training set for ImageNet-10, ImageNet-Dogs and Tiny-ImageNet; taking the 20 super-classes of CIFAR-100 as the ground truth.…”
Section: Methods
Mentioning confidence: 99%
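The evaluation setup quoted above, pooling train and test splits without labels and scoring CIFAR-100 against its 20 super-classes, can be sketched as follows. This is a minimal illustration with placeholder arrays; `fine_to_coarse` is a hypothetical lookup table (the real fine-to-coarse mapping ships with the CIFAR-100 metadata), not code from the cited papers:

```python
import numpy as np

# Placeholder arrays standing in for CIFAR images: in the setup quoted
# above, the 50,000-sample train split and 10,000-sample test split are
# pooled for clustering, with labels discarded during training.
x_train = np.zeros((50000, 32, 32, 3), dtype=np.uint8)
x_test = np.zeros((10000, 32, 32, 3), dtype=np.uint8)
x_all = np.concatenate([x_train, x_test])  # 60,000 unlabelled samples

# For CIFAR-100 evaluation, each fine label (0..99) is mapped to one of
# the 20 super-classes; this lookup table is a zero-filled placeholder.
fine_to_coarse = np.zeros(100, dtype=np.int64)
y_fine = np.random.randint(0, 100, size=len(x_all))
y_coarse = fine_to_coarse[y_fine]  # ground truth used for scoring
```

Labels are used only at evaluation time (e.g., clustering accuracy or NMI against `y_coarse`), never during training.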
“…Existing deep clustering approaches generally fall into two categories according to the training strategy: (1) alternate training [49,46,7,44,6,50,15] and (2) simultaneous training [23,36,35,16]. The alternate training strategy typically estimates the ground-truth membership from a pretrained or up-to-date model and in turn supervises the network training with the estimated information. DEC [46] initialises cluster centroids by running K-means [32] on pretrained image features and then fine-tunes the model to learn from confident cluster assignments, sharpening the resulting prediction distribution.…”
Section: Related Work
Mentioning confidence: 99%
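The DEC recipe described in the quote above, soft cluster assignments that are sharpened via a self-training target distribution, can be sketched as below. The helper names `soft_assign` and `target_distribution` are assumptions for illustration, following Xie et al.'s published formulation (Student's t-kernel assignments, squared-and-renormalised targets):

```python
import numpy as np

def soft_assign(z, centroids, alpha=1.0):
    # Soft assignment q_ij: Student's t-kernel similarity between
    # embedding z_i and centroid mu_j, normalised over clusters.
    dist2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    q = (1.0 + dist2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    # Auxiliary target p_ij: square assignments and renormalise by
    # per-cluster frequency, emphasising high-confidence assignments.
    weight = q ** 2 / q.sum(axis=0)
    return weight / weight.sum(axis=1, keepdims=True)

# Example: 5 embeddings, 3 centroids (K-means on pretrained features
# would supply the real centroids in DEC's initialisation step).
rng = np.random.default_rng(0)
z = rng.normal(size=(5, 8))
centroids = rng.normal(size=(3, 8))
q = soft_assign(z, centroids)
p = target_distribution(q)
```

Training then minimises KL(p || q) while updating both the network and the centroids, which is what makes the strategy "alternate": estimate targets, then supervise with them.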