2018
DOI: 10.1155/2018/9632569
Ensemble Learning Based Multiple Kernel Principal Component Analysis for Dimensionality Reduction and Classification of Hyperspectral Imagery

Abstract: Classification is one of the most challenging tasks in remotely sensed data processing, particularly for hyperspectral imaging (HSI). Dimension reduction is widely applied as a preprocessing step for classification; however, reducing the dimension with conventional methods may not always guarantee a high classification rate. Principal component analysis (PCA) and its nonlinear version, kernel PCA (KPCA), are traditional dimension reduction algorithms. In a previous work, a variant of KPCA, denoted as …

Cited by 16 publications (5 citation statements)
References 46 publications
“…To address the abovementioned challenges, dimensionality reduction (DR) [9–12] and semisupervised classification [13, 14] approaches have been extensively adopted for HSIs. Generally, there are two classes of DR, i.e., band selection and feature extraction [15].…”
Section: Introduction
confidence: 99%
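To make the distinction concrete: band selection keeps a subset of the original spectral bands unchanged, whereas feature extraction derives new features as functions of all bands. The sketch below is illustrative only; the synthetic data, the variance-based selection rule, and the use of PCA are stand-ins, not taken from the cited references.

# Minimal, illustrative contrast between the two DR families.
# The data and methods here are stand-ins, not from the cited works.
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(1000, 200)   # 1000 pixels x 200 spectral bands (synthetic)

# Band selection: retain a subset of the physical bands, e.g. the 30
# highest-variance ones; the kept values are unchanged original measurements.
keep = np.argsort(X.var(axis=0))[::-1][:30]
X_bs = X[:, keep]

# Feature extraction: compute 30 new features as functions of all bands,
# here linear combinations found by PCA.
X_fe = PCA(n_components=30).fit_transform(X)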
“…Li X et al. proposed a Two-stage Subspace Projection framework, in which they used the KPCA method to carry out feature projection of HSI data [25]. Further improvements have been proposed on the basis of KPCA, such as superpixelwise KPCA [26] and multiple KPCA based on ensemble learning [27].…”
Section: Feature Extraction Methods
confidence: 99%
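Citation [27] points to the paper summarised above. As a rough, hypothetical illustration of the general idea of a multiple-kernel KPCA front end (not the authors' actual ensemble algorithm), one can concatenate KPCA embeddings computed with several kernels and train an ordinary classifier on the stacked features; the kernel choices and parameters below are assumptions.

# Hypothetical sketch using scikit-learn; this is NOT the exact ensemble
# scheme of the cited paper, only one way to combine several KPCA views.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def multi_kernel_features(X, n_components=10):
    """Stack KPCA embeddings obtained with different kernels."""
    settings = [("rbf", {"gamma": 0.5}), ("poly", {"degree": 3}), ("sigmoid", {})]
    blocks = [KernelPCA(n_components=n_components, kernel=k, **kw).fit_transform(X)
              for k, kw in settings]
    return np.hstack(blocks)

# X: (n_pixels, n_bands) hyperspectral samples, y: per-pixel class labels
# Z = multi_kernel_features(X)
# clf = make_pipeline(StandardScaler(), SVC()).fit(Z, y)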
“…The second step is the construction of the covariance matrix of the data, and in the next stage, the eigenvectors and eigenvalues of the covariance matrix are calculated by the singular value decomposition (SVD) as follows: $\boldsymbol{c}\boldsymbol{v} = \lambda\boldsymbol{v}$, where $\boldsymbol{c}$ is the covariance matrix, and $\boldsymbol{v}$ and $\lambda$ are the eigenvectors and eigenvalues of the covariance matrix, respectively. Finally, for mapping the data to 2D space, the eigenvectors corresponding to the two largest eigenvalues are selected, and by multiplying $A_{\mathrm{normalised}}(x, y)$ by the desired eigenvectors, the mapping of the data into 2D space is performed. The kernel PCA algorithm: kernel PCA is a generalisation of linear PCA that, unlike standard linear PCA, allows nonlinear dimensionality reduction, which is helpful for complex data that cannot be well represented in a linear space [28]. In fact, in the kernel PCA algorithm, the data are projected into the new space by a nonlinear kernel function. Assume we have a nonlinear transformation $\varphi(x_i)$ which transfers data from the $D$-dimensional feature space ($\mathbb{R}^n$) to a high-dimensional space ($\mathbb{R}^f$) in which it is linearly separable.…”
Section: Methods
confidence: 99%
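The quoted passage walks through both algorithms step by step, and the sketch below follows it directly: a minimal NumPy rendition of linear PCA via the eigenproblem $\boldsymbol{c}\boldsymbol{v} = \lambda\boldsymbol{v}$ and of kernel PCA on a centred kernel matrix. The RBF kernel and the variable names are assumptions, since the quote does not fix a kernel.

# Minimal sketch of the two procedures described above: linear PCA via the
# eigenproblem c v = lambda v, and kernel PCA on a centred kernel matrix.
# The RBF kernel and variable names are assumptions, not from the source.
import numpy as np

def pca_2d(X):
    """Map rows of X onto the two leading principal components."""
    Xc = X - X.mean(axis=0)                      # centre the data
    C = np.cov(Xc, rowvar=False)                 # covariance matrix c
    vals, vecs = np.linalg.eigh(C)               # solves c v = lambda v
    top2 = vecs[:, np.argsort(vals)[::-1][:2]]   # two largest eigenvalues
    return Xc @ top2                             # 2D mapping of the data

def kernel_pca_2d(X, gamma=1.0):
    """Nonlinear variant: eigendecompose the centred RBF kernel matrix."""
    sq = (X ** 2).sum(axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    Kc = J @ K @ J                               # centre phi(x) implicitly
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:2]
    alphas = vecs[:, idx] / np.sqrt(vals[idx])   # normalised coefficients
    return Kc @ alphas                           # 2D embedding of the samples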