2020
DOI: 10.1007/s10489-020-01797-y
Autoencoder-based unsupervised clustering and hashing

Abstract: Faced with large volumes of high-dimensional data in databases, existing exact nearest neighbor retrieval methods cannot obtain ideal retrieval results within an acceptable retrieval time. Researchers have therefore turned to approximate nearest neighbor retrieval. Recently, hashing-based approximate nearest neighbor retrieval has attracted increasing attention because of its small storage footprint and high retrieval efficiency. The development of neural networks has…
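The abstract's core idea — trading exact search for fast approximate retrieval over compact binary codes — can be illustrated with a minimal sketch. This is not the AUCH method itself, only a generic hashing-based lookup: items are stored as short binary codes, and retrieval ranks them by Hamming distance to the query code.

```python
import numpy as np

def hamming_search(query_code, database_codes, k=3):
    """Return indices of the k database codes closest to the query in Hamming distance."""
    # XOR-style comparison: count differing bits per database entry
    dists = np.count_nonzero(database_codes != query_code, axis=1)
    # Stable sort so ties keep database order; take the k nearest
    return np.argsort(dists, kind="stable")[:k]

# Toy database of 4-bit binary codes (real systems use 32-128 bits per item)
db = np.array([
    [0, 1, 1, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 0],
])
query = np.array([0, 1, 1, 0])
print(hamming_search(query, db, k=2))  # nearest codes first
```

Because Hamming distance reduces to bit counting, this lookup stays cheap even for millions of items — the efficiency argument the abstract makes for hashing-based retrieval.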

Cited by 17 publications (4 citation statements) | References 51 publications
“…Finally, a major breakthrough was made in clustering using features obtained by constraining the encoder with an adversarial loss and a discriminator. In 2021, Zhang and Qian [7] proposed an unsupervised deep hashing method for large-scale data retrieval called autoencoder-based unsupervised clustering and hashing (AUCH). AUCH unifies unsupervised clustering and retrieval tasks in a single learning model.…”
Section: Related Work
confidence: 99%
“…(3)

Method                                            Clustering accuracy on eight benchmarks
deep clustering via joint conv. autoencoder [12]  0.392  0.315  0.206  0.210  0.400  0.219  0.350  0.3864
DEC [13]                                          0.843  0.441  0.244  0.359  0.460  0.257  0.426  0.4540
DEPICT [18]                                       0.917  0.964  0.212  0.224  0.455  0.243  0.324  0.4130
DAC [41]                                          0.977  0.653  0.521  0.469  0.307  0.236  0.312  0.3250
Deepcluster [14]                                  -      -      0.376  0.332  -      -      -      -
IIC [8]                                           0.992  -      0.617  0.596  -      -      -      -
DCCS [6]                                          0.989  -      0.656  0.536  -      -      -      -
AUCH [7]                                          0.960  0.775  0.318  0.734  -      -      -      -
GFDC                                              0.993  0.974  0.615  0.720  0.902  0.833  0.993  0.9520

Note: "**" denotes the clustering accuracy provided by a previous study, "-" denotes no value available, and the best results are emphasized in bold.…”
Section: Comparison With the State-of-the-art
confidence: 99%
“…The idea of autoencoders has existed for more than 30 years [6], and its applications are now widespread. They range from generalization and classification tasks to denoising, anomaly detection, recommender systems, clustering, and dimensionality reduction, with striking results [7, 9-13]. In this work, we focus on the latter two use cases, in which autoencoders perform unsupervised feature extraction and dimensionality reduction [14, 15].…”
Section: Why Are Autoencoders Interesting?
confidence: 99%
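The dimensionality-reduction use case quoted above can be demonstrated with a deliberately tiny example — not the architecture of any cited paper, just a linear autoencoder trained by plain gradient descent on data that secretly lies near a 2-D subspace. The bottleneck forces the network to discover that low-dimensional structure from reconstruction error alone, i.e. without labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 5-D that actually lie near a 2-D subspace
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.01 * rng.normal(size=(200, 5))

# Linear autoencoder: encoder W_e (5 -> 2), decoder W_d (2 -> 5),
# trained by gradient descent on the reconstruction loss ||X - X W_e W_d||^2
W_e = rng.normal(scale=0.1, size=(5, 2))
W_d = rng.normal(scale=0.1, size=(2, 5))
lr = 0.01
for _ in range(2000):
    Z = X @ W_e            # encode: 2-D bottleneck representation
    X_hat = Z @ W_d        # decode: reconstruction in the original 5-D space
    err = X_hat - X
    W_d -= lr * (Z.T @ err) / len(X)
    W_e -= lr * (X.T @ (err @ W_d.T)) / len(X)

loss = np.mean((X - X @ W_e @ W_d) ** 2)
print(f"final reconstruction MSE: {loss:.4f}")
```

With a linear encoder this recovers the same subspace as PCA; the deep, nonlinear variants in the cited works extend exactly this objective, and the 2-D codes `Z` are what downstream clustering or hashing would operate on.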