2020
DOI: 10.1109/jstars.2020.3024241
Kernel Low-Rank Entropic Component Analysis for Hyperspectral Image Classification

Abstract: Principal component analysis (PCA) and its variations are still the primary tools for feature extraction (FE) in the remote sensing community. This is unfortunate, as there are strong arguments against using PCA for this purpose, owing to its inherently linear nature and its uninformative principal components. Several critical issues must therefore still be considered when applying PCA to hyperspectral image classification, among which: 1) the large number of spectral channels and small number of training samples; 2…

Cited by 4 publications (9 citation statements)
References 38 publications
“…where the eigenvalues l1, …, lN corresponding to w1, …, wN are retained as the entropy estimate. In other words, KECA performs feature extraction not via the eigenvectors associated with the top Nc eigenvalues but via the axes contributing most to the Rényi entropy estimate V(p) (Jenssen, 2009; Bai et al., 2020a). This is the most distinctive difference between KECA and KPCA (Jenssen, 2009).…”
Section: Kernel Entropy Component Analysis
confidence: 99%
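The distinction described in the quote above can be sketched in code. The following is a minimal, illustrative NumPy version of KECA (the function name, the RBF kernel choice, and the parameter values are assumptions for illustration, not taken from the paper): it ranks the kernel eigen-axes by their contribution to the Rényi entropy estimate V(p), which is proportional to λ_i (1ᵀe_i)², rather than by eigenvalue magnitude alone.

```python
import numpy as np

def keca(X, n_components=2, sigma=1.0):
    """Kernel entropy component analysis (illustrative sketch)."""
    # RBF (Gaussian) kernel matrix over the samples.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    # Eigendecomposition of the symmetric kernel matrix.
    lam, E = np.linalg.eigh(K)
    # Entropy contribution of each axis: proportional to lam_i * (1^T e_i)^2.
    contrib = np.clip(lam, 0.0, None) * (E.sum(axis=0)) ** 2
    # KECA keeps the axes with the largest entropy contributions,
    # not the largest eigenvalues (which is what KPCA would keep).
    idx = np.argsort(contrib)[::-1][:n_components]
    # Projected coordinates: sqrt(lam_i) * e_i for the selected axes.
    return E[:, idx] * np.sqrt(np.clip(lam[idx], 0.0, None))
```

Swapping the selection line for `idx = np.argsort(lam)[::-1][:n_components]` would recover the KPCA-style choice, which is precisely the difference the quoted passage highlights.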
“…The width coefficient, also known as the kernel width s, plays a key role in kernel-based methods (Bai et al., 2019; Bai et al., 2020a). We evaluated the impact of the kernel width on KECANet using the selected dataset.…”
Section: Impact of the Width Coefficient
confidence: 99%
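A quick numerical illustration of why the width matters (a generic RBF-kernel observation, not a result from the cited work): for two samples at a fixed distance, a larger width makes them look more similar, flattening the kernel matrix, while a smaller width localizes it.

```python
import numpy as np

# RBF kernel value for two points at squared distance d2,
# k = exp(-d2 / (2 * sigma^2)), for several kernel widths.
d2 = 1.0
for sigma in (0.5, 1.0, 2.0):
    k = np.exp(-d2 / (2.0 * sigma ** 2))
    # The similarity k grows monotonically with sigma.
    print(f"sigma={sigma}: k={k:.3f}")
```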
“…The compared methods include the AF, 36 PCA, 37 KPCA, 38 MNF, 28 OTVCA, 23 and KLRECA. 11 The default parameters of the different models are set to be the same as the settings reported in the corresponding papers.…”
Section: Comparison Techniques and Evaluation Metrics
confidence: 99%
“…For the preprocessing step, most well-known methods normalize the original hyperspectral input data without additional denoising treatment. [9][10][11] However, the noisy part of an image gathered by remote sensing may degrade information extraction performance owing to various known and unknown interfering factors. 12 Some projection-based band reduction techniques, for example, traditional principal component analysis (PCA), 13,14 minimum noise fraction (MNF), 15 and classic PCA with segmented-PCA combined with two-dimensional singular spectrum analysis, 16 linearly project the pretreated HSI into a linear subspace.…”
Section: Introduction
confidence: 99%
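The linear band-reduction idea described in the quote above can be sketched as a plain PCA projection of a hyperspectral cube (the function name, shapes, and parameter values below are illustrative assumptions, not the cited papers' implementations):

```python
import numpy as np

def pca_band_reduction(cube, n_components=10):
    """Project an (H, W, B) hyperspectral cube onto its top principal bands."""
    H, W, B = cube.shape
    # Flatten pixels into rows, one spectral vector per pixel.
    X = cube.reshape(-1, B).astype(float)
    X -= X.mean(axis=0)
    # Covariance across the B spectral bands.
    cov = X.T @ X / (X.shape[0] - 1)
    lam, V = np.linalg.eigh(cov)
    # Keep the eigenvectors with the largest eigenvalues (most variance).
    order = np.argsort(lam)[::-1][:n_components]
    return (X @ V[:, order]).reshape(H, W, n_components)
```

Each output channel is a linear combination of the original bands, ordered by explained variance; methods such as MNF differ mainly in replacing the variance criterion with a signal-to-noise criterion.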