2019
DOI: 10.1109/tnnls.2018.2861209

Spectral Embedded Adaptive Neighbors Clustering

Abstract: Spectral clustering has been widely used in various fields, especially in machine learning. Clustering with a similarity matrix and a low-dimensional representation of the data is the main reason for the promising performance of spectral clustering. However, a similarity matrix and low-dimensional representation derived directly from the input data may not always hold when the data are high dimensional and have a complex distribution. First, a similarity matrix based simply on a distance measurement migh…
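For orientation, the following is a minimal sketch of the standard spectral clustering pipeline the abstract builds on (similarity matrix, graph Laplacian, low-dimensional spectral embedding, then k-means); it is not the adaptive-neighbors method proposed in the paper, and the Gaussian-kernel similarity, the sigma and cluster-count parameters, and the NumPy/SciPy/scikit-learn calls are illustrative assumptions.

```python
# Minimal sketch of *standard* spectral clustering, for orientation only;
# this is NOT the adaptive-neighbors method proposed in the paper.
# Gaussian-kernel similarity, sigma, and n_clusters are illustrative assumptions.
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def spectral_clustering(X, n_clusters=3, sigma=1.0):
    # Pairwise squared Euclidean distances.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian-kernel similarity matrix: the distance-based construction
    # the abstract notes can be unreliable for high-dimensional, complex data.
    W = np.exp(-sq_dists / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetrically normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(len(X)) - D_inv_sqrt @ W @ D_inv_sqrt
    # Low-dimensional representation: eigenvectors of the smallest eigenvalues.
    _, vecs = eigh(L, subset_by_index=[0, n_clusters - 1])
    # Row-normalize the embedding and cluster it with k-means.
    rows = vecs / np.maximum(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(rows)
```

Per the abstract, the paper's contribution is to learn the similarity and the embedding adaptively rather than fixing both from raw distances as this baseline sketch does.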

Cited by 137 publications (33 citation statements)
References 18 publications
“…Thus, the overall complexity of the algorithm scales linearly with the number of ALS iterations and with the tensor ranks, and cubically in the problem dimensions. When processing large datasets, the extra complexity could be partially mitigated by applying image segmentation or band selection [6], [37], [38] strategies. This analysis is beyond the scope of the present work and will be addressed in the future.…”
Section: E. Computational Complexity of Algorithm (mentioning)
confidence: 99%
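As a rough reading of the scaling described in the quote, a back-of-the-envelope cost model is sketched below; the symbols T (ALS iterations), R (tensor rank), and d (problem dimension), and the constant-free O(T·R·d³) form, are assumptions drawn from the wording, not figures taken from the cited work.

```python
# Back-of-the-envelope cost model for the quoted scaling: linear in the
# number of ALS iterations T and in the tensor rank R, cubic in the
# problem dimension d. Symbols and form are assumptions, not the cited
# paper's formula.
def als_cost_estimate(T: int, R: int, d: int) -> int:
    return T * R * d ** 3

# Example: doubling d multiplies the estimate by 8, doubling R only by 2.
print(als_cost_estimate(T=50, R=10, d=200) / als_cost_estimate(T=50, R=10, d=100))  # 8.0
```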
“…The number of divided blocks K deciding the dimension of extracted feature is chosen from the set of {3, 5, 10, 15, 20, 25, 30, 35, 40, 50} in sequence. Figure 10 presents the experimental result.…”
Section: Parameters Setting Discussion (mentioning)
confidence: 99%
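The quoted tuning procedure amounts to a one-dimensional sweep over the block count K. A minimal sketch is given below; the `evaluate_clustering` callback and the keep-the-best-score rule are hypothetical placeholders, not the cited paper's code.

```python
# Sketch of the parameter sweep described in the quote: try each candidate
# block count K in sequence and keep the best-scoring one.
# `evaluate_clustering(data, n_blocks)` is a hypothetical placeholder for
# whatever quality measure the cited work reports.
K_CANDIDATES = [3, 5, 10, 15, 20, 25, 30, 35, 40, 50]

def sweep_block_count(data, evaluate_clustering):
    scores = {K: evaluate_clustering(data, n_blocks=K) for K in K_CANDIDATES}
    best_K = max(scores, key=scores.get)
    return best_K, scores
```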
“…These spectra are represented by hundreds of continuous bands that can meticulously describe the characteristics of different materials to recognize their subtle differences [3]. Therefore, owing to this good discriminative property of hyperspectral image, it has been widely used in many remote sensing research fields [4,5], such as image denoising [6,7], hyperspectral unmixing [8,9], band selection [10,11], target detection [12,13], and image classification [14,15]. They all have important practical applications in geological exploration, urban remote sensing and planning management, environment and disaster monitoring, precision agriculture, archaeology, etc.…”
Section: Introduction (mentioning)
confidence: 99%
“…Hence, it is essential to reduce hyperspectral data. Feature selection has been a hot topic in the field of machine learning [2][3][4], which is viewed as an effective measure in HSI analysis for dimensionality reduction. It removes some redundant information and can also obtain satisfactory results in comparison to the raw data.…”
Section: Introduction (mentioning)
confidence: 99%
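As a concrete illustration of feature selection for dimensionality reduction in hyperspectral imagery, a minimal variance-ranking sketch follows; the variance criterion, the band count k, and the cube layout are illustrative assumptions, not the selection method used in the cited works.

```python
import numpy as np

# Minimal illustration of unsupervised band selection on a hyperspectral
# cube of shape (rows, cols, bands): rank bands by variance and keep the
# top-k, discarding the most redundant (low-variance) bands.
def select_bands_by_variance(cube: np.ndarray, k: int = 30) -> np.ndarray:
    pixels = cube.reshape(-1, cube.shape[-1])      # (n_pixels, n_bands)
    band_variance = pixels.var(axis=0)             # spread of each band
    keep = np.argsort(band_variance)[::-1][:k]     # indices of top-k bands
    return cube[..., np.sort(keep)]                # reduced cube
```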