A survey on Laplacian eigenmaps based manifold learning methods
2019
DOI: 10.1016/j.neucom.2018.06.077

Cited by 49 publications (16 citation statements)
References 127 publications
“…Laplacian Eigenmaps (LE) [11] has the remarkable property of preserving the local neighborhood structure of data. LE captures the relationships between data points from a local perspective, reconstructing the local structure and features of the data by building an adjacency graph [30]. If two data instances x i and x j are very similar, their images should be as close as possible in the target subspace after dimensionality reduction.…”
Section: Laplacian Eigenmaps
confidence: 99%
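The excerpt above summarizes the LE procedure: build a k-nearest-neighbor adjacency graph, weight edges with a heat kernel, and embed by solving the generalized eigenproblem of the graph Laplacian. A minimal sketch, assuming a symmetrized kNN graph and heat-kernel weights (the function name, `k`, and the kernel width `t` are illustrative choices, not fixed by the survey):

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=5, t=1.0):
    """Embed rows of X by Laplacian Eigenmaps: kNN adjacency graph with
    heat-kernel weights, then solve L y = lam D y for the smallest
    non-trivial generalized eigenvectors."""
    n = X.shape[0]
    D2 = cdist(X, X, 'sqeuclidean')          # pairwise squared distances
    # kNN adjacency, symmetrized so W is a valid undirected graph
    W = np.zeros((n, n))
    nbrs = np.argsort(D2, axis=1)[:, 1:k + 1]  # skip self (column 0)
    for i in range(n):
        for j in nbrs[i]:
            w = np.exp(-D2[i, j] / t)          # heat-kernel edge weight
            W[i, j] = W[j, i] = w
    D = np.diag(W.sum(axis=1))               # degree matrix
    L = D - W                                # unnormalized graph Laplacian
    # Smallest generalized eigenvectors; drop the trivial constant one
    _, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]
```

Similar points get large edge weights, so the eigenproblem penalizes placing them far apart in the embedding, which is exactly the locality-preserving property the excerpt describes.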
“…To find the projection of any additional samples, LE must be rerun on all the data together with the additional samples, incurring considerable computational cost, especially when applied to large-scale pattern recognition. Fortunately, various methods have been developed to mitigate this out-of-sample problem [39]: linear approximations to LE, kernel extensions to LE,…”
Section: Out-of-sample Extension
confidence: 99%
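One of the linear approximations mentioned above is, in the spirit of Locality Preserving Projections, to constrain the embedding to be a linear map y = Aᵀx; the eigenproblem is then solved once in the input space, and new samples are projected without rerunning the decomposition. A hedged sketch under those assumptions (the graph construction, `reg` ridge term, and function names are illustrative, not the survey's notation):

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def lpp_fit(X, n_components=2, k=5, t=1.0, reg=1e-6):
    """Linear approximation to LE: solve X^T L X a = lam X^T D X a
    for projection directions a, so embedding is linear in x."""
    n = X.shape[0]
    D2 = cdist(X, X, 'sqeuclidean')
    W = np.zeros((n, n))
    nbrs = np.argsort(D2, axis=1)[:, 1:k + 1]
    for i in range(n):
        for j in nbrs[i]:
            w = np.exp(-D2[i, j] / t)
            W[i, j] = W[j, i] = w
    D = np.diag(W.sum(axis=1))
    L = D - W
    M1 = X.T @ L @ X
    M2 = X.T @ D @ X + reg * np.eye(X.shape[1])  # ridge keeps M2 invertible
    _, vecs = eigh(M1, M2)
    return vecs[:, :n_components]  # columns = projection directions

def lpp_transform(A, X_new):
    """Out-of-sample projection: map unseen samples with the learned A."""
    return X_new @ A
```

Because the map is an explicit matrix, projecting a new sample costs one matrix-vector product rather than a full re-embedding of the augmented dataset.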
“…A selected subset of the first several PCs, rather than the original spectral vectors, is then used for classification; thus, the feature dimension can be significantly reduced. In addition, some manifold learning methods have been developed to analyze the intrinsic features of HSI, including Laplacian eigenmaps (LE) [14], locally linear embedding (LLE) [15], and its extension, robust local manifold representation (RLMR) [16]. However, these methods consider only the spectral information, which limits them due to the lack of spatial analysis.…”
Section: Introduction
confidence: 99%