2019
DOI: 10.1016/j.neucom.2018.10.015
Unsupervised adaptive hashing based on feature clustering

Cited by 16 publications (9 citation statements)
References 24 publications
“…In this section, we verify the effectiveness of our proposed method by carrying out image-retrieval experiments on three real-world image datasets: CIFAR-10 [35], YouTube Faces [36], and MNIST [37]. We compare SCUH with eight well-established hashing algorithms: ITQ [17], LGHSR [43], SH [38], SGH [34], AGH [42], DGH [41], PCAH [18], and FCH [30].…”
Section: Methods
confidence: 99%
“…Principal component analysis hashing (PCAH [18]) also acquires compact binary codes and preserves the similarities among neighbors. Later on, feature clustering hashing (FCH [30]) further accounts for the unbalanced variance distribution and uncertain similarity relations when learning the projection matrix, as in PCAH and PCA-ITQ. To model manifold structure, scalable graph hashing (SGH [19]) and anchor graph hashing (AGH [31]) are devised to capture the geometrical structure of the dataset in Hamming space, demonstrating the advantages of unsupervised graph hashing.…”
Section: Introduction
confidence: 99%
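The PCAH idea mentioned in the statement above (project onto the top principal components, then binarize by sign) can be illustrated with a minimal NumPy sketch; the function name and parameters here are our own illustration, not taken from the cited papers.

```python
import numpy as np

def pcah_codes(X, n_bits):
    """Sketch of PCA hashing: sign-binarize projections onto the
    top n_bits principal directions of the centered data."""
    Xc = X - X.mean(axis=0)              # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_bits].T                    # top principal directions
    return (Xc @ W > 0).astype(np.uint8) # one bit per direction

codes = pcah_codes(np.random.randn(100, 32), n_bits=8)
```

Methods such as ITQ and FCH start from this same projection but additionally rotate or reweight it, since the raw PCA directions carry unbalanced variance across bits.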
“…Duan et al. [45] treated projection and quantization as a whole and solved them jointly by minimizing the reconstruction bias of the signals. Yuan et al. [46] adopted k-means to quantize the codes for each feature generated by feature clustering.…”
Section: Related Work
confidence: 99%
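The per-feature k-means quantization attributed to Yuan et al. above can be sketched roughly as follows: learn a small set of levels for a 1-D feature and replace each value by the index of its nearest level. This is a generic 1-D k-means quantizer, a hypothetical illustration rather than the cited authors' actual procedure.

```python
import numpy as np

def kmeans_quantize(values, n_levels, n_iter=20, seed=0):
    """Quantize a 1-D feature with k-means: return, for each value,
    the index of its nearest learned level (centroid)."""
    rng = np.random.default_rng(seed)
    levels = rng.choice(values, n_levels, replace=False).astype(float)
    for _ in range(n_iter):
        idx = np.abs(values[:, None] - levels[None]).argmin(axis=1)
        for j in range(n_levels):
            if (idx == j).any():          # skip empty clusters
                levels[j] = values[idx == j].mean()
    idx = np.abs(values[:, None] - levels[None]).argmin(axis=1)
    return idx, levels

vals = np.random.default_rng(1).normal(size=50)
idx, levels = kmeans_quantize(vals, n_levels=4)
```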
“…Once the centroids are obtained, the newly extracted features are the distances of each object in the dataset with respect to the k centroids. K-means clustering was used for dimensionality reduction in [15] for image classification, dubbed the Feature Clustering Hashing method. In this work, we have implemented K-means clustering straightforwardly, as proposed in [19], where the cluster labels are used as the new features.…”
Section: K-means Clustering
confidence: 99%
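The centroid-distance representation described in the statement above can be sketched in a few lines of NumPy: run plain k-means, then represent each sample by its distances to the k centroids. This is a generic illustration under that reading of the text, not the implementation from [15] or [19].

```python
import numpy as np

def kmeans_distance_features(X, k, n_iter=20, seed=0):
    """Reduce X (n_samples, n_dims) to (n_samples, k): each new
    feature is the distance to one learned k-means centroid."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():       # skip empty clusters
                centroids[j] = X[labels == j].mean(axis=0)
    # new low-dimensional representation: distance to each centroid
    return np.linalg.norm(X[:, None] - centroids[None], axis=2)

F = kmeans_distance_features(np.random.randn(200, 64), k=5)
```

The output has k columns regardless of the input dimensionality, which is what makes the construction usable as dimensionality reduction.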
“…The K-Means clustering method is seldom used for dimensionality reduction, though in a recently published paper [15] K-Means was used for hashing-based clustering to reduce feature dimensionality for image classification. That work explained the difference between image clustering and feature clustering for image-classification purposes.…”
confidence: 99%