2020 IEEE International Conference on Big Data (Big Data) 2020
DOI: 10.1109/bigdata50022.2020.9378265
Semi-Unsupervised Learning: Clustering and Classifying using Ultra-Sparse Labels

Cited by 12 publications (14 citation statements) | References 5 publications
“…When running our implementation of a VLAE over SVHN we observed that the 3rd layer was associated most clearly with variation in digit identity. In our experiments we ran VLAC with K = K_one = [1, 1, 50, 1] and K_two = [1, 5, 50, 1]. We evaluate the cluster accuracy of y_3 over the test set, taking as our predictions the argmax of the posterior q_φ(y_3|x).…”
Section: Methods (mentioning)
confidence: 99%
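The evaluation described in the statement above (predict a cluster for each test point via the argmax of q_φ(y_3|x), then score against the true labels) is conventionally computed as best-match cluster accuracy using the Hungarian algorithm, since cluster indices carry no inherent class meaning. A minimal sketch of that metric; the function name and the use of `scipy.optimize.linear_sum_assignment` are my assumptions, not code from the paper:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cluster_accuracy(y_true, y_pred):
    """Best-match cluster accuracy: find the one-to-one mapping from
    predicted cluster indices to true class labels that maximises
    agreement, then report the matched fraction."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = max(y_true.max(), y_pred.max()) + 1
    # Contingency matrix: counts[p, t] = how often cluster p co-occurs with label t.
    counts = np.zeros((n, n), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        counts[p, t] += 1
    # Hungarian algorithm minimises cost, so negate to maximise agreement.
    rows, cols = linear_sum_assignment(-counts)
    return counts[rows, cols].sum() / y_true.size

# Toy check: clusters {0, 1} are a pure relabelling of classes {1, 0}.
print(cluster_accuracy([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0
```

For the SVHN experiment quoted above, `y_pred` would be the argmax of the inferred posterior over the K = 50 clusters at layer 3, and `y_true` the digit labels of the test set.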
“…Various deep learning-based algorithms have been proposed for clustering, including: Gaussian Mixture DGMs [3,4,5], GM-VAE [12], VaDE [13], IMSAT [14], DEC [15] and ACOL-GAR [16].…”
Section: Related Work (mentioning)
confidence: 99%
“…The motivation for our approach is from classification models using variational approximations. (Kingma et al. 2014; Maaløe et al. 2016; Willetts, Roberts, and Holmes 2020; Xie et al. 2021) utilized various variational inference structures on semi-supervised classification tasks. They showed that variational approaches improve the performance for classification tasks, and one of the tasks in temporal point processes is event classification.…”
Section: Background and Related Work: Multivariate Temporal Point Process (mentioning)
confidence: 99%
“…VaDE (Jiang et al., 2017) is the most important prior work, a probabilistic model which has been highly influential in deep clustering. Related approaches, GM-VAEs (Dilokthanakul et al., 2017) and GM-DGMs (Nalisnick et al., 2016; Willetts et al., 2018, 2020), have similar overall performance and explicitly represent the discrete clustering latent variable during training. Non-parametric approaches include DLDP-MMs (Nalisnick et al., 2016), and HDP-VAEs (Goyal et al., 2017).…”
Section: Related Work (mentioning)
confidence: 99%