2019
DOI: 10.1109/access.2019.2916030

Clustering With Orthogonal AutoEncoder

Abstract: Recently, clustering algorithms based on deep AutoEncoders have attracted considerable attention due to their excellent clustering performance. Meanwhile, the success of PCA-Kmeans and spectral clustering corroborates that orthogonality of the embedding is beneficial to clustering accuracy. In this paper, we propose a novel dimensionality reduction model, called Orthogonal AutoEncoder (OAE), which encourages orthogonality of the learned embedding. Furthermore, we propose a joint deep clustering framework…

Cited by 37 publications (16 citation statements)
References 37 publications
“…However, estimating the network's parameters via the minimization of the reconstruction error L(·) does not guarantee that the features in the latent space are orthogonal to each other, a property that might provide better class-discriminability. Thus, in order to ensure orthogonality among the components in the latent space, we added a regularization term to the reconstruction error L(·), similar to Wang et al. (2019). Formally, the orthogonal reconstruction error L(·) can be defined as:…”
Section: Hyperspectral Dimensionality Reduction via Orthogonal Autoencoders (mentioning)
confidence: 99%
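As a concrete illustration of this kind of penalty, the following is a minimal PyTorch sketch of a soft orthogonality regularizer added to the reconstruction error. The exact regularizer in the cited papers may differ; the weight `lambda_orth` and the function name are illustrative, not taken from either paper.

```python
import torch
import torch.nn.functional as F

def orthogonal_reconstruction_loss(x, x_hat, z, lambda_orth=1.0):
    # Reconstruction term: plain mean-squared error between input and output.
    recon = F.mse_loss(x_hat, x)
    # Soft orthogonality penalty: drive the Gram matrix of the latent
    # codes, Z^T Z / n, toward the identity so that the latent components
    # become mutually orthogonal (one plausible form of such a regularizer).
    gram = z.t() @ z / z.shape[0]
    eye = torch.eye(z.shape[1], device=z.device)
    orth = torch.norm(gram - eye, p="fro") ** 2
    return recon + lambda_orth * orth
```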
“…For example, Chen et al. (2014) employed Autoencoders, whereas Liu et al. (2018) used Recurrent Neural Networks for hyperspectral image classification tasks. In a similar vein, Wang et al. (2019) developed a semi-supervised learning method for clustering that improves class-separability among the latent features via Orthogonal Autoencoders in general classification tasks; however, to the best of our knowledge, Wang's method has never been used in hyperspectral image analysis thus far.…”
Section: Introduction (mentioning)
confidence: 99%
“…It can capture the data distribution through neural networks and be used to generate new samples. Its variants have been widely studied and applied in various fields, such as unsupervised clustering [2] and image generation [3], [4]. The VAE [1] belongs to the family of maximum-likelihood generative models.…”
Section: Introduction (mentioning)
confidence: 99%
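For context, the VAE's maximum-likelihood connection comes from optimizing the evidence lower bound (ELBO). Below is a sketch of the standard negative-ELBO training loss; this is the textbook formulation, not anything specific to the citing paper.

```python
import torch
import torch.nn.functional as F

def vae_loss(x, x_hat, mu, log_var):
    # Reconstruction term of the negative ELBO (Gaussian decoder assumption).
    recon = F.mse_loss(x_hat, x, reduction="sum")
    # Closed-form KL divergence between the approximate posterior
    # N(mu, sigma^2) and the standard normal prior.
    kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    return recon + kl
```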
“…The enforcement of exact orthogonality also serves as a regularization by shrinking the model space. It eliminates multicollinearity between the basis vectors and subsequently reduces the risk of overfitting (Abdi and Williams, 2010; Wang et al., 2019). This has advantages both for increasing the classification performance of subsequent supervised learning approaches, by using the generated topic vectors as new features, and for the interpretation of these topic vectors themselves.…”
(mentioning)
confidence: 99%
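To make the multicollinearity point concrete, the short NumPy sketch below (using a hypothetical random basis `W`, not data from the cited work) compares the Gram matrix of an arbitrary basis with that of an exactly orthonormalized one obtained via QR decomposition: the off-diagonal correlations vanish after orthonormalization.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(50, 5))   # hypothetical basis vectors as columns
Q, _ = np.linalg.qr(W)         # orthonormal basis spanning the same subspace

# Off-diagonal Gram entries measure collinearity between basis vectors;
# for the orthonormalized basis the Gram matrix is the identity.
print(np.round(W.T @ W, 2))    # generally non-diagonal
print(np.round(Q.T @ Q, 2))    # identity, up to floating-point error
```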