2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9207493

Unsupervised Clustering through Gaussian Mixture Variational AutoEncoder with Non-Reparameterized Variational Inference and Std Annealing

Cited by 10 publications (2 citation statements)
References 7 publications
“…An alternative simple migration [85] yields a composition of an InfoNCE loss [61] and a clustering one [77]. Compared with the deep generative counterparts [16,36,53,82], contrastive clustering is free from decoding and computationally practical, with guaranteed feature quality.…”
Section: Introduction
Confidence: 99%
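The statement above describes contrastive clustering as a composition of an InfoNCE loss over instance embeddings and a clustering loss over soft cluster assignments, with no decoder involved. The following is a minimal, hedged sketch of that kind of composition; the function names, the temperature parameter `tau`, the two-view batch layout, and the entropy regularizer are illustrative assumptions, not the exact formulation of any cited paper.

```python
# Sketch of a contrastive-clustering objective: InfoNCE over instance embeddings
# plus a contrastive loss over cluster-assignment columns. Names and hyperparameters
# are assumptions for illustration only.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, tau=0.5):
    """InfoNCE over two augmented views; positives are matching rows of z1/z2."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                       # (N, N) cosine-similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)

def cluster_loss(p1, p2, tau=1.0):
    """Contrast cluster-assignment columns across views, with an entropy term
    that discourages collapsing every sample into a single cluster."""
    c1, c2 = F.normalize(p1.t(), dim=1), F.normalize(p2.t(), dim=1)  # (K, N) each
    logits = c1 @ c2.t() / tau                       # (K, K)
    targets = torch.arange(c1.size(0), device=c1.device)
    contrast = F.cross_entropy(logits, targets)
    marginal = p1.mean(dim=0)                        # average assignment per cluster
    entropy = -(marginal * (marginal + 1e-8).log()).sum()
    return contrast - entropy                        # maximize marginal entropy

# total loss for a two-view batch (z*: embeddings, p*: softmax cluster assignments):
# loss = info_nce(z1, z2) + cluster_loss(p1, p2)
```

Because both terms operate only on encoder outputs, no reconstruction or decoding pass is needed, which is the computational advantage the excerpt refers to.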
“…Although VAE can learn data distributions in the latent space, which is useful for unsupervised learning tasks, the Gaussian prior may lead to crowded clusters, hindering the subsequent clustering process from effectively separating different groups [34]. Several DC models [35][36][37][38] turn to the Gaussian mixture prior for modeling the discrete clusters in the latent space, which has contributed to a more clustering-oriented VAE framework. In spite of that, the vanilla VAE lacks the ability to capture the detailed features of objects and is notorious for blurry output images [39][40][41], which could potentially lead to inferior clustering performance.…”
Section: Introduction
Confidence: 99%
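The excerpt above motivates replacing the single Gaussian prior with a Gaussian mixture prior so that each mixture component can model one cluster in the latent space. Below is a minimal, hedged sketch of a generic GMM-prior VAE; the layer sizes, heads, and closed-form responsibility computation are assumptions for illustration, and for simplicity it uses standard reparameterized sampling rather than the non-reparameterized variational inference and std annealing named in the paper's title.

```python
# Sketch of a VAE with a learnable Gaussian mixture prior over the latent code.
# Architecture sizes and head names are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GMMVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=10, n_clusters=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU())
        self.mu_head = nn.Linear(256, z_dim)
        self.logvar_head = nn.Linear(256, z_dim)
        self.decoder = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                     nn.Linear(256, x_dim))
        # Learnable mixture prior: component means, log-variances, and mixing logits.
        self.prior_mu = nn.Parameter(torch.randn(n_clusters, z_dim) * 0.5)
        self.prior_logvar = nn.Parameter(torch.zeros(n_clusters, z_dim))
        self.prior_logits = nn.Parameter(torch.zeros(n_clusters))

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu_head(h), self.logvar_head(h)
        # Reparameterized sample (the titled paper instead uses a
        # non-reparameterized estimator; this is a simplification).
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        # Component log-densities of z, up to an additive constant: (N, K).
        log_pz_c = (-0.5 * ((z.unsqueeze(1) - self.prior_mu) ** 2
                            / self.prior_logvar.exp()
                            + self.prior_logvar)).sum(-1)
        # Soft cluster responsibilities q(c|x) from Bayes' rule over components.
        log_q_c = F.log_softmax(log_pz_c + F.log_softmax(self.prior_logits, 0), dim=1)
        return self.decoder(z), mu, logvar, log_q_c

# Cluster prediction for a batch x: log_q_c.argmax(dim=1)
```

Training such a model would add a reconstruction term and a KL term between the approximate posterior and the mixture prior; the mixture components then serve directly as cluster centers, which is what makes this family of models more clustering-oriented than a single-Gaussian VAE.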