2021
DOI: 10.3390/app11199063
An Optimization Technique for Linear Manifold Learning-Based Dimensionality Reduction: Evaluations on Hyperspectral Images

Abstract: Manifold learning seeks low-dimensional manifolds within high-dimensional data, which makes it useful for removing redundant information from the input. Linear manifold learning algorithms can be applied to out-of-sample data, making them fast and practical, especially for classification purposes. Locality preserving projection (LPP) and orthogonal locality preserving projection (OLPP) are two well-known linear manifold learning algorithms. In this study, scatter information of a distance matrix is used to construct a weight …
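The abstract names LPP as one of the linear manifold learning algorithms the study builds on. Below is a minimal sketch of standard LPP with a heat-kernel, k-nearest-neighbour weight matrix; the paper's proposed weight construction from scatter information of a distance matrix is not reproduced here, and all function and parameter names are illustrative assumptions.

```python
# Minimal sketch of standard Locality Preserving Projection (LPP).
# The neighbourhood size k and heat-kernel width t are illustrative defaults.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_components=10, k=7, t=1.0):
    """X: (n_samples, n_features) data matrix. Returns an (n_features, n_components) projection."""
    D2 = cdist(X, X, metric="sqeuclidean")           # pairwise squared distances
    W = np.zeros_like(D2)
    knn = np.argsort(D2, axis=1)[:, 1:k + 1]         # k nearest neighbours, skipping self
    for i, nbrs in enumerate(knn):
        W[i, nbrs] = np.exp(-D2[i, nbrs] / t)        # heat-kernel weights
    W = np.maximum(W, W.T)                           # symmetrise the adjacency graph
    Dm = np.diag(W.sum(axis=1))                      # degree matrix
    L = Dm - W                                       # graph Laplacian
    # Generalised eigenproblem X^T L X a = lambda X^T D X a; the eigenvectors
    # with the smallest eigenvalues give the projection directions.
    XLX = X.T @ L @ X
    XDX = X.T @ Dm @ X + 1e-6 * np.eye(X.shape[1])   # small ridge for numerical stability
    _, eigvecs = eigh(XLX, XDX)
    return eigvecs[:, :n_components]

# Usage: A = lpp(X); X_low = X @ A
```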

Cited by 2 publications (3 citation statements). References: 29 publications.
“…SLPP (Öztürk & Yılmaz, 2021) incorporates the sample category information into both the adjacency relationship determination and the corresponding adjacency weight calculation. Specifically, SLPP treats sample points of the same category as adjacent and assigns them an adjacency weight of W = 1/N, where N is the number of sample points in that category.…”
Section: Methods (mentioning)
confidence: 99%
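The citing text describes the supervised weighting rule directly: same-class points are adjacent with weight 1/N, where N is the size of that class. A minimal sketch of such a weight matrix is given below; the function and variable names are illustrative, not taken from the cited paper, and self-weights are kept on the diagonal for simplicity.

```python
# Supervised adjacency weights as described in the citing text:
# same-class pairs receive weight 1/N, where N is the class size.
import numpy as np

def supervised_weights(labels):
    """labels: (n_samples,) class labels. Returns an (n_samples, n_samples) weight matrix."""
    labels = np.asarray(labels)
    n = labels.shape[0]
    W = np.zeros((n, n))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        W[np.ix_(idx, idx)] = 1.0 / idx.size   # every same-class pair gets 1/N
    return W

# Example: three classes of sizes 2, 2 and 1
print(supervised_weights([0, 0, 1, 1, 2]))
```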
“…SLPP (Öztürk & Yılmaz, 2021) … Slime mold algorithm (SMA) (Li et al., 2020) imitates the foraging behavior of slime molds and the establishment of network structures.…”
Section: Region Of Interest Selection (mentioning)
confidence: 99%
“…Its representative methods are Principal Component Analysis (PCA) [5,6] and Linear Discriminant Analysis (LDA) [7,8]. Local feature-based manifold learning [9,10] and sparse representation [11] are also important hyperspectral feature extraction methods. At the same time, feature extraction can also combine two different features of space and spectrum for DR [12].…”
Section: Introduction (mentioning)
confidence: 99%
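The citing introduction names PCA as a representative linear dimensionality-reduction method for hyperspectral data. As a concrete illustration only, here is a minimal PCA sketch for reducing the spectral dimension of a hyperspectral cube; the band and component counts are illustrative assumptions, not values from any of the cited works.

```python
# Minimal PCA sketch for spectral dimensionality reduction of a hyperspectral cube.
import numpy as np

def pca_reduce(cube, n_components=30):
    """cube: (rows, cols, bands) hyperspectral image. Returns (rows, cols, n_components)."""
    r, c, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X -= X.mean(axis=0)                          # centre each spectral band
    cov = X.T @ X / (X.shape[0] - 1)             # band covariance matrix
    _, eigvecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
    comps = eigvecs[:, ::-1][:, :n_components]   # top principal directions
    return (X @ comps).reshape(r, c, n_components)
```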