2005
DOI: 10.1007/11503415_32
From Graphs to Manifolds – Weak and Strong Pointwise Consistency of Graph Laplacians

Abstract: In the machine learning community it is generally believed that graph Laplacians corresponding to a finite sample of data points converge to a continuous Laplace operator as the sample size increases. Even though this assertion serves as a justification for many Laplacian-based algorithms, so far only some aspects of this claim have been rigorously proved. In this paper we close this gap by establishing the strong pointwise consistency of a family of graph Laplacians with data-dependent weights to some…

Cited by 243 publications
(271 citation statements)
References 6 publications
“…In general, from the analysis of Hein et al. [10], given the exact tangent space T_{X_α}M, the approximation of normal coordinate values based on PCA yields an error of O(ε²), where ε is the radius of N_k(X_i). Further, the Hessian corresponding to the fitted local polynomial may deviate from the true Hessian.…”
Section: Regularization on a Point Cloud in R^N
confidence: 99%
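The excerpt above refers to estimating the tangent space at a point by applying PCA to its local neighborhood N_k(X_i). A minimal sketch of that idea (function and variable names are illustrative, not from the cited paper): center the k nearest neighbors at the base point and take the leading principal direction as the tangent estimate.

```python
import numpy as np

def estimate_tangent(points, base_idx, k):
    """Estimate a tangent direction at points[base_idx] from its k nearest neighbors."""
    dists = np.linalg.norm(points - points[base_idx], axis=1)
    nbr_idx = np.argsort(dists)[1:k + 1]           # exclude the base point itself
    centered = points[nbr_idx] - points[base_idx]  # local coordinates around X_i
    # Leading right-singular vector = first principal direction of the neighborhood.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

# Toy check: points on the unit circle; the tangent at (1, 0) should be ~(0, ±1).
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
t = estimate_tangent(circle, 0, 10)
```

The O(ε²) error mentioned in the excerpt shows up here as the curvature of the circle inside the neighborhood: the centered neighbors deviate from the true tangent line by roughly ε²/2 at radius ε.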
“…For instance, the graph Laplacian matrix is used to measure the pairwise dissimilarities of the evaluation of a function f on a given point cloud X, and subsequently this can be used for discretized diffusion and regularization of f on X. One way of justifying the use of the graph Laplacian comes from its limiting behavior as |X| → ∞: when the data X is generated from an underlying manifold M, i.e., when the corresponding probability distribution P has support in M, the graph Laplacian converges to the Laplace-Beltrami operator [2,10], which respects only the intrinsic geometry of M. Accordingly, for a large X, the graph Laplacian helps us measure the variation of functions along M and neglect any random perturbations normal to M that might be irrelevant noise.…”
Section: Introduction
confidence: 99%
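The convergence claim in this excerpt can be illustrated numerically. A minimal sketch (an assumed Gaussian-weight construction, not the exact normalization of the cited papers): build the unnormalized graph Laplacian L = D − W on samples from the unit circle and check that a Laplace-Beltrami eigenfunction, f(θ) = cos θ with −Δ_M f = f, is approximately an eigenvector of L.

```python
import numpy as np

# Samples from the unit circle (a 1-D manifold embedded in R^2).
n, h = 400, 0.3  # sample size and kernel bandwidth (both illustrative choices)
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])

# Gaussian edge weights w_ij = exp(-|x_i - x_j|^2 / h^2), zero self-loops.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq / h**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W  # unnormalized graph Laplacian L = D - W

# f = cos(theta) is a Laplace-Beltrami eigenfunction on the circle,
# so L f should point (almost) in the same direction as f.
f = np.cos(theta)
Lf = L @ f
cos_sim = Lf @ f / (np.linalg.norm(Lf) * np.linalg.norm(f))
```

With uniform sampling the weight matrix is circulant, so the Fourier modes (including cos θ) are exact eigenvectors here; for nonuniform samples the alignment holds only approximately, which is where the consistency results discussed in this paper come in.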
“…It is proved that, under certain conditions, as the size of the data set grows the graph Laplacian converges to the Laplace-Beltrami operator on the data manifold [4], [16]. In summary, using (29) with exponential weights can effectively measure the smoothness of the data assignments with respect to the intrinsic data manifold.…”
Section: Global Regularization
confidence: 98%
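The smoothness measure this excerpt alludes to is the graph Dirichlet energy fᵀLf = ½ Σᵢⱼ wᵢⱼ (fᵢ − fⱼ)², with exponential weights wᵢⱼ = exp(−|xᵢ − xⱼ|²/h²). A minimal sketch (bandwidth and variable names are illustrative): an assignment that varies slowly along the manifold should have much lower energy than one unrelated to the geometry.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 300, 0.3
theta = rng.uniform(0, 2 * np.pi, n)
X = np.column_stack([np.cos(theta), np.sin(theta)])  # points on a circle

# Exponential (Gaussian) weights and the unnormalized graph Laplacian.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-sq / h**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

f_smooth = np.sin(theta)           # varies slowly along the manifold
f_noisy = rng.standard_normal(n)   # unrelated to the geometry
e_smooth = f_smooth @ L @ f_smooth  # Dirichlet energy f^T L f
e_noisy = f_noisy @ L @ f_noisy
```

Here e_smooth comes out far below e_noisy, which is exactly the sense in which the quadratic form "measures smoothness with respect to the intrinsic data manifold."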
“…This has been the subject of intensive research over the past few years by various authors [6,11,24–29]. Here we present the main results without detailed mathematical proofs and refer the reader to the above works.…”
Section: Asymptotics of the Diffusion Map
confidence: 99%