2011
DOI: 10.1109/lgrs.2011.2128854

Locality-Preserving Discriminant Analysis in Kernel-Induced Feature Spaces for Hyperspectral Image Classification

Abstract: Locality-preserving projection and local Fisher discriminant analysis are applied for dimensionality reduction of hyperspectral imagery based on both spatial and spectral information. These techniques preserve the local geometric structure of hyperspectral data in a low-dimensional subspace, wherein a Gaussian-mixture-model classifier is then applied. In the proposed classification system, local spatial information, which is expected to be more multimodal than strictly spectral features, is used. Resul…
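As a rough illustration of the pipeline the abstract describes, the sketch below pairs a plain locality-preserving projection with a per-class Gaussian-mixture classifier. This is a minimal, spectral-only sketch: the synthetic arrays X and y, the neighborhood size k, and the heat-kernel width t are placeholder assumptions, and the paper's spatial features and kernel-induced variants are not reproduced here.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph
from sklearn.mixture import GaussianMixture

def lpp(X, n_components, k=7, t=1.0):
    """Locality-preserving projection: find linear directions that keep
    k-nearest neighbors close in the low-dimensional subspace."""
    W = kneighbors_graph(X, k, mode='distance').toarray()
    W = np.where(W > 0, np.exp(-W ** 2 / t), 0.0)   # heat-kernel weights
    W = np.maximum(W, W.T)                          # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                       # graph Laplacian
    A = X.T @ L @ X                                 # minimize  a' X L X' a
    B = X.T @ D @ X                                 # subject to a' X D X' a = 1
    _, vecs = eigh(A, B + 1e-6 * np.eye(B.shape[0]))
    return vecs[:, :n_components]                   # smallest eigenvalues first

# Placeholder data: X is (pixels x bands), y holds class labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))
y = rng.integers(0, 3, size=300)

Z = X @ lpp(X, n_components=5)

# One Gaussian mixture per class; predict by the highest log-likelihood.
gmms = {c: GaussianMixture(n_components=2, reg_covar=1e-4,
                           random_state=0).fit(Z[y == c])
        for c in np.unique(y)}
scores = np.column_stack([gmms[c].score_samples(Z) for c in sorted(gmms)])
pred = np.argmax(scores, axis=1)
```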

Cited by 111 publications (49 citation statements) | References 20 publications
“…However, the high-dimensional nature of HSI data creates complications for k-NN classification in terms of both computational complexity and classification accuracy. Many dimensionality-reducing techniques have been proposed to combat this so-called curse of dimensionality, such as the popular linear discriminant analysis (LDA) [8] and its variants (e.g., [9], [10]). Typically, parametric classification is employed after dimensionality reduction, for example the maximum likelihood estimation (MLE) [11] of posterior probabilities.…”
Section: Introduction (mentioning)
confidence: 99%
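The reduce-then-classify pattern this excerpt describes (LDA for dimensionality reduction, followed by maximum-likelihood estimation of class-conditional Gaussians) can be sketched with scikit-learn. The synthetic dataset below merely stands in for high-dimensional spectral vectors and is not from the cited work.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import train_test_split

# Synthetic stand-in for high-dimensional spectral vectors.
X, y = make_classification(n_samples=600, n_features=100,
                           n_informative=20, n_classes=4,
                           n_clusters_per_class=1, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# LDA projects to at most (n_classes - 1) dimensions.
lda = LinearDiscriminantAnalysis(n_components=3).fit(Xtr, ytr)
Ztr, Zte = lda.transform(Xtr), lda.transform(Xte)

# QDA fits one Gaussian per class by maximum likelihood and classifies
# via the posterior -- the parametric step the excerpt mentions.
qda = QuadraticDiscriminantAnalysis().fit(Ztr, ytr)
print("accuracy:", qda.score(Zte, yte))
```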
“…First, 3-D Gabor filters with various frequencies and orientations are adopted to convert the original HSI into multiple data cubes, which provide complementary spectral-spatial information. Subsequently, a cube evaluation criterion, motivated by Fisher's ratio (FR) [44] and conditional mutual information [45], is proposed to measure the sufficiency and independence of the cubes obtained in the first step. Cubes that pass this assessment are regarded as views in the proposed method.…”
Section: Proposed Methods (mentioning)
confidence: 99%
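A minimal sketch of the filtering step this excerpt describes: build a small 3-D Gabor filter bank and convolve it with a hyperspectral cube. The kernel size, frequencies, orientations, and the random cube are illustrative assumptions, and the Fisher's-ratio / mutual-information cube evaluation is not shown.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel_3d(size, freq, theta, phi, sigma):
    """Real part of a 3-D Gabor filter: a Gaussian envelope times a
    cosine plane wave with frequency `freq` along direction (theta, phi)."""
    ax = np.arange(size) - size // 2
    x, y, z = np.meshgrid(ax, ax, ax, indexing='ij')
    # Unit direction vector from the two angles.
    u = np.array([np.sin(phi) * np.cos(theta),
                  np.sin(phi) * np.sin(theta),
                  np.cos(phi)])
    envelope = np.exp(-(x**2 + y**2 + z**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * freq * (u[0]*x + u[1]*y + u[2]*z))
    return envelope * carrier

# Hypothetical HSI cube: rows x cols x bands.
rng = np.random.default_rng(0)
cube = rng.normal(size=(32, 32, 64))

# A small bank over a few frequencies/orientations, as the excerpt describes.
bank = [gabor_kernel_3d(7, f, th, np.pi / 4, sigma=2.0)
        for f in (0.1, 0.25) for th in (0.0, np.pi / 2)]
responses = [convolve(cube, k, mode='nearest') for k in bank]
```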
“…where x_i and x_j are vectors of observations, σ = s·σ₀, σ₀ is the mean distance between the observations in feature space, and s is a scale factor [33,37]. Figures 7c and 12c show the sensitivity of KPCA, KMNF, and OKMNF with respect to s. We can see that both OKMNF and KPCA show better performance than KMNF.…”
Section: Parameter Tuning (mentioning)
confidence: 95%
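The bandwidth rule in this excerpt (σ = s·σ₀, with σ₀ the mean pairwise distance between observations) is easy to reproduce for the KPCA case with scikit-learn; KMNF and OKMNF have no stock implementation there, so the sketch below sweeps the scale factor s for an RBF KernelPCA only. The data array X is a placeholder.

```python
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))          # placeholder feature vectors

# sigma_0: mean pairwise distance, excluding the zero diagonal.
D = pairwise_distances(X)
sigma0 = D[np.triu_indices_from(D, k=1)].mean()

for s in (0.5, 1.0, 2.0):               # the scale factor swept in the excerpt
    sigma = s * sigma0
    # RBF kernel k(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 * sigma^2)),
    # expressed through scikit-learn's gamma parameterization.
    kpca = KernelPCA(n_components=5, kernel='rbf',
                     gamma=1.0 / (2 * sigma**2))
    Z = kpca.fit_transform(X)
    print(f"s = {s}: projected shape {Z.shape}")
```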