2021
DOI: 10.1007/s10994-021-06015-5
Joint optimization of an autoencoder for clustering and embedding

Abstract: Deep embedded clustering has become a dominating approach to unsupervised categorization of objects with deep neural networks. The optimization of the most popular methods alternates between the training of a deep autoencoder and a k-means clustering of the autoencoder's embedding. This diachronic setting, however, prevents the former from benefiting from valuable information acquired by the latter. In this paper, we present an alternative where the autoencoder and the clustering are learned simultaneously. This is …
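The contrast the abstract draws — alternating autoencoder training and k-means versus learning both simultaneously — can be illustrated with a minimal numpy sketch. This is an assumed toy formulation, not the paper's exact objective: a linear encoder/decoder trained by gradient descent on a joint loss (reconstruction plus a λ-weighted k-means distortion on the embedding), with a k-means centroid update interleaved at every step. All names, shapes, and the λ weighting are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))              # toy data
d, k, lam, lr = 2, 3, 0.1, 0.01             # latent dim, clusters, weight, step

W_e = rng.normal(scale=0.1, size=(10, d))   # linear encoder
W_d = rng.normal(scale=0.1, size=(d, 10))   # linear decoder
C = rng.normal(size=(k, d))                 # k-means centroids

def joint_loss(W_e, W_d, C):
    Z = X @ W_e
    a = np.argmin(((Z[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
    rec = ((X - Z @ W_d) ** 2).sum()        # reconstruction error
    clu = ((Z - C[a]) ** 2).sum()           # k-means distortion in latent space
    return rec + lam * clu, a

loss0, _ = joint_loss(W_e, W_d, C)
for _ in range(200):
    Z = X @ W_e
    a = np.argmin(((Z[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
    R = Z @ W_d - X                         # reconstruction residual
    G = Z - C[a]                            # clustering residual
    # gradient step on encoder/decoder for the *joint* loss (linear case)
    W_e -= lr * (2 * X.T @ (R @ W_d.T) + 2 * lam * X.T @ G) / len(X)
    W_d -= lr * (2 * Z.T @ R) / len(X)
    # centroid update = one k-means step on the current embedding
    for j in range(k):
        if (a == j).any():
            C[j] = Z[a == j].mean(axis=0)

loss1, assign = joint_loss(W_e, W_d, C)
```

Because the clustering residual feeds into the encoder gradient, the embedding is shaped by the clusters at every step — exactly the feedback the alternating (diachronic) scheme forgoes.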

Cited by 13 publications (8 citation statements)
References 42 publications
“…For example, we would expect the latent embeddings of analytes with similar cross‐reactive fingerprints to be within the same cluster, while analytes with very distinct cross‐reactive fingerprints should appear in different clusters. Furthermore, there are established modifications to the AE architecture, such as Deep Embedded Clustering (DEC), that enable the user to train a model that simultaneously learns cluster assignments and the underlying latent representation to maximize clustering in an unsupervised fashion [97,98]. In differential sensing, the targeted analytes are typically known a priori, and this information could be used to generate a latent representation that clusters the targeted number of analytes.…”
Section: Employing More Advanced Machine Learning
confidence: 99%
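The DEC mechanism this snippet refers to rests on two small formulas from Xie et al. (2016): a Student's-t soft assignment of each embedding to the centroids, and a sharpened target distribution used for self-training. The sketch below reproduces those two formulas in numpy; the array names and toy sizes are illustrative, not from any particular implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(size=(6, 2))        # latent embeddings from an encoder
mu = rng.normal(size=(3, 2))       # cluster centroids

def soft_assign(Z, mu, alpha=1.0):
    # Student's-t similarity between embeddings and centroids (DEC's q_ij)
    d2 = ((Z[:, None, :] - mu[None]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1) / 2)
    return q / q.sum(axis=1, keepdims=True)

def target_dist(q):
    # DEC's target p_ij: square q to sharpen, normalize by cluster frequency
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)

q = soft_assign(Z, mu)
p = target_dist(q)
```

Training then minimizes KL(p ‖ q), pulling the embedding toward high-confidence assignments — which is how cluster assignments and the latent representation are learned together.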
“…It is well established that autoencoders are great at anomaly detection, where the traditional approach is to utilize reconstruction-error based scores.…”
Section: Employing More Advanced Machine Learning
confidence: 99%
“…Additional variations of DEC have been proposed: Xie et al. (2016) used a stacked denoising autoencoder (Vincent et al., 2010) in their original implementation, but Min et al. (2018) employed autoencoders composed of CNN layers and other architectures. More recently, Chazan et al. (2019) developed an approach in which joint clustering is performed with a mixture of autoencoders, each representing a cluster, and Boubekki et al. (2021) demonstrated improved performance using a clustering algorithm that is jointly optimized with the embeddings of the autoencoder. Mousavi et al. (2019) used DEC to predict whether seismic detections were local or teleseismic, and Snover et al. (2021) demonstrated the ability of DEC to cluster anthropogenically generated seismic noise.…”
Section: Deep Embedded Clustering
confidence: 99%
“…Recent deep clustering approaches partition high-dimensional input features by performing clustering and representation learning simultaneously [12]. Owing to the recent success of these methods, we develop a novel training approach that takes inspiration from the Gaussian mixture model (GMM) and utilizes a one-layer autoencoder called the clustering module (CM) [13] for the SSL task. Our method is called Semi-Supervised Clustering Module (SuperCM), which does not rely on a complex training scheme and achieves performance improvement with respect to its supervised-only baseline.…”
Section: Introduction
confidence: 99%
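The GMM machinery such a clustering module draws on can be sketched generically — this is an illustrative, standard GMM E/M step over encoder embeddings, not the SuperCM implementation: the E-step computes soft responsibilities of each embedding for each component, and the M-step re-estimates the component means from them.

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.normal(size=(8, 2))                 # embeddings from an encoder
mu = rng.normal(size=(3, 2))                # component means
var, pi = 1.0, np.full(3, 1 / 3)            # shared variance, mixing weights

def e_step(Z, mu, var, pi):
    # responsibilities r_ij ∝ pi_j * N(z_i | mu_j, var * I)
    d2 = ((Z[:, None, :] - mu[None]) ** 2).sum(-1)
    logp = np.log(pi) - d2 / (2 * var)      # log-density up to a constant
    logp -= logp.max(axis=1, keepdims=True) # stabilize before exponentiation
    r = np.exp(logp)
    return r / r.sum(axis=1, keepdims=True)

def m_step_means(Z, r):
    # responsibility-weighted means of the embeddings
    return (r.T @ Z) / r.sum(axis=0)[:, None]

r = e_step(Z, mu, var, pi)
mu_new = m_step_means(Z, r)
```

In a semi-supervised setting, labeled points can have their responsibility row clamped to a one-hot vector before the M-step, which is one simple way such a module could exploit supervision.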