2005
DOI: 10.1007/11564126_34
ISOLLE: Locally Linear Embedding with Geodesic Distance

Abstract: Locally Linear Embedding (LLE) has recently been proposed as a method for dimensional reduction of high-dimensional nonlinear data sets. In LLE each data point is reconstructed from a linear combination of its n nearest neighbors, which are typically found using the Euclidean distance. We propose an extension of LLE which consists in performing the search for the neighbors with respect to the geodesic distance (ISOLLE). In this study we show that the usage of this metric can lead to a more accurate p…
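The neighbor search the abstract describes can be sketched as follows. This is a hypothetical illustration, not the authors' code: the function name, parameters, and the choice of a k-NN graph to approximate geodesic distances are our own assumptions about the standard construction.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def geodesic_neighbors(X, n_neighbors=5, graph_k=10):
    """Find each point's n_neighbors nearest neighbors under an
    approximate geodesic metric (illustrative sketch, not ISOLLE itself)."""
    # Pairwise Euclidean distances between all points.
    diff = X[:, None, :] - X[None, :, :]
    E = np.sqrt((diff ** 2).sum(-1))
    n = len(X)
    # Keep only each point's graph_k nearest Euclidean neighbors as
    # weighted graph edges (column 0 of the argsort is the point itself).
    nn = np.argsort(E, axis=1)[:, 1:graph_k + 1]
    rows = np.repeat(np.arange(n), graph_k)
    cols = nn.ravel()
    G = csr_matrix((E[rows, cols], (rows, cols)), shape=(n, n))
    # Approximate geodesic distances as shortest-path lengths on the graph.
    D = shortest_path(G, method='D', directed=False)
    # The n_neighbors nearest neighbors under the geodesic metric, per point.
    return np.argsort(D, axis=1)[:, 1:n_neighbors + 1]
```

The selected neighbor indices can then feed the usual LLE reconstruction-weight step in place of Euclidean nearest neighbors.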

Cited by 10 publications (10 citation statements)
References 10 publications
“…The algorithm includes five general steps as follows. Research has shown that the geodesic distance is superior to the default setting of Euclidean distance in LLE, and can eliminate the "short circuit" problem and lead to a more faithful representation of the global structure of the underlying manifold [56]. The geodesic distance is approximated as the length of the shortest path between a pair of data points in a weighted graph, which can be computed as in [57].…”
Section: Linear Neighborhood Propagationmentioning
confidence: 99%
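The shortest-path approximation described above can be illustrated with a toy example (our own sketch, not code from the cited works): on a chain graph, the geodesic distance between the end points is the sum of the edge weights along the chain, which is what Dijkstra's algorithm computes.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

# Four points connected in a chain 0-1-2-3; each edge has weight 1.0
# (its Euclidean length). Entry [i, j] is the weight of edge i-j.
edges = np.array([[0.0, 1.0, 0.0, 0.0],
                  [1.0, 0.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0, 1.0],
                  [0.0, 0.0, 1.0, 0.0]])
graph = csr_matrix(edges)
# All-pairs geodesic (shortest-path) distances on the weighted graph.
D = dijkstra(graph, directed=False)
print(D[0, 3])  # 3.0: the path 0-1-2-3, summing the chain's edge weights
```

Note that the geodesic distance from point 0 to point 3 is 3.0 even if the two points were close in the ambient Euclidean space; this is exactly how the graph metric avoids the "short circuit" problem.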
“…The premise of manifold learning is based on a priori assumptions. Different from the previous works, we explore the geometry of data distribution through the manifold constraint technique by presenting a neighbor-preserving embedding algorithm [40] and then utilize it as an added regularization term to discover the manifold estimates. This formalization method is similar to LLE [41] in calculating the weights in dimension reduction.…”
Section: Related Workmentioning
confidence: 99%
“…As described in LLE algorithm [28], a focal point on a manifold can be expressed by a small and dense set of K nearest neighbors to approximate ISOMAP (Isometric Feature Mapping) [42]. Obviously, the Geodesic distance which is employed in ISOMAP [40] is another available technique to detect the neighbors in a linear embedding.…”
Section: Related Workmentioning
confidence: 99%
“…Differently, in our method data is embedded into a manifold not known a priori but learned from the very same data. Now we formalize the manifold by introducing a neighbor-preserving embedding [20,5], which aims to find an estimation of the manifold. Such a formalization is similar to [5] which calculates the weights in the process of dimension reduction by LLE [18].…”
Section: Meadmmmentioning
confidence: 99%
“…As shown in LLE [21], a local point on a manifold can be represented by a small and compact set of K nearest neighbors to approximate ISOMAP. In [20], it has been shown that the Geodesic distance used in ISOMAP is another effective way to locate the neighbors for a linear embedding. We first follow the idea in [15] to estimate the manifold dimension by PCA.…”
mentioning
confidence: 99%