2022
DOI: 10.1007/s00205-022-01770-8

From Graph Cuts to Isoperimetric Inequalities: Convergence Rates of Cheeger Cuts on Data Clouds

Abstract: In this work we study statistical properties of graph-based clustering algorithms that rely on the optimization of balanced graph cuts, the main example being the optimization of Cheeger cuts. We consider proximity graphs built from data sampled from an underlying distribution supported on a generic smooth compact manifold $\mathcal{M}$. In this setting, we obtain high probability convergence rates for both the Cheeger constant and the associated Cheeger cuts towards their…
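The abstract describes Cheeger cuts optimized over proximity graphs built from sampled data. As a reference point only, the following is a minimal sketch of how such a cut is commonly approximated in practice (spectral relaxation followed by a sweep cut); the function name cheeger_cut_spectral, the ε-graph with unit weights, the degree-volume balance term, and all parameter values are illustrative assumptions, not the paper's algorithm.

```python
# Minimal sketch (assumed construction, not the paper's method): approximate a
# Cheeger cut of an epsilon-proximity graph on a point cloud via the standard
# spectral relaxation plus a sweep cut over the Fiedler vector.
import numpy as np

def cheeger_cut_spectral(X, eps):
    """Approximate the Cheeger constant and cut of the eps-proximity graph on X."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    W = (D <= eps).astype(float)          # unit-weight proximity graph
    np.fill_diagonal(W, 0.0)
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                  # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    v2 = vecs[:, 1]                       # second eigenvector (Fiedler vector)

    # Sweep cut: threshold the Fiedler vector and keep the best Cheeger ratio.
    # The balance term here is the degree volume; other normalizations
    # (e.g. cardinality) are also common.
    best_ratio, best_mask = np.inf, None
    order = np.argsort(v2)
    for k in range(1, n):
        mask = np.zeros(n, dtype=bool)
        mask[order[:k]] = True
        cut = W[mask][:, ~mask].sum()
        balance = min(deg[mask].sum(), deg[~mask].sum())
        if balance > 0 and cut / balance < best_ratio:
            best_ratio, best_mask = cut / balance, mask.copy()
    return best_ratio, best_mask

# Toy example: two well-separated clusters sampled in the plane.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(3.0, 0.3, (50, 2))])
h, S = cheeger_cut_spectral(X, eps=1.0)
print("approximate Cheeger constant:", h)
```

The sweep over level sets of the Fiedler vector is a standard heuristic; the paper's results concern convergence of the discrete Cheeger constant and minimizing cuts to their continuum counterparts, not any particular solver.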

Cited by 6 publications (4 citation statements)
References: 59 publications
“…In a slightly different direction, quantitative stability estimates for critical points have been addressed for the isoperimetric inequality on Euclidean space [23,38] and for the Sobolev inequality [22,28]. Quantitative stability estimates have wide-ranging applications to contexts including characterization of minimizers in variational problems [21], rates of convergence of PDE [16], regularity of interfaces in free boundary problems [2], and even data science [35]. Apart from [18], all of these results make crucial use of the explicit form of minimizers and critical points or of the symmetries of the ambient space.…”
Section: Moreover (mentioning)
confidence: 99%
“…It is clear that $F(0) = 0$ since $N(0) = 0$. To see that $\nabla F(0) = 0$, we appeal to (38) with $\phi = 0$ and see that it suffices to show that $\pi_{K^\perp} \nabla N^{-1}(0)[\eta] = 0$ for any $\eta \in K$. And indeed, by (35), we see that $\nabla_B N(v)$ maps $K$ to $K$ and that $\nabla_B N(v)|_K = (\nabla N^{-1}(0))^{-1}|_K = \mathrm{Id}$. Thus $\nabla F(0) = 0$.…”
Section: Local Quantitative Stability of Minimizers (mentioning)
confidence: 99%
“…An interesting consequence of our Gamma-convergence result is the convergence of adversarial training (1.1) as ε → 0 to a solution of the problem with ε = 0 with minimal perimeter. Furthermore, the primary result in the continuum setting can be used to recover a Gamma-convergence result for graph discretizations of the nonlocal perimeter; a setting that is especially relevant in the context of graph-based machine learning [33,34,36]. Our approach for this discrete to continuum convergence is in the spirit of these works and relies on $TL^p$ (transport $L^p$) spaces, which were introduced for the purpose of proving Gamma-convergence of a graph total variation by García Trillos and Slepčev in [34], but have also been used to prove quantitative convergence statements for graph problems, see, e.g., the works [14,15] by Calder et al.…”
Section: Introduction (mentioning)
confidence: 99%
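For orientation, the graph total variation mentioned in the excerpt above is typically written in a form like the following; the normalization and kernel notation are assumptions in the spirit of the construction attributed to [34] in the quote, not a verbatim definition from either paper.

```latex
% Sketch of a graph total variation on a point cloud x_1, ..., x_n
% (assumed normalization; kernel eta with bandwidth epsilon).
\[
  \mathrm{GTV}_{n,\varepsilon}(u)
  \;=\;
  \frac{1}{\varepsilon n^{2}}
  \sum_{i,j=1}^{n}
  \eta\!\left(\frac{|x_i - x_j|}{\varepsilon}\right)
  \bigl| u(x_i) - u(x_j) \bigr|.
\]
% For an indicator function u = 1_A this plays the role of a (nonlocal) graph
% perimeter of A, the discrete object whose Gamma-limit is discussed in the
% excerpt; convergence is measured in the TL^p topology recalled there.
```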
“…This has motivated theoretical works like [67] which study the convergence of graph total variation to a continuum weighted total variation (the same paper proposed a topology to study the convergence that didn't require regularity—in particular pointwise evaluation—of the continuum function). Total variation functionals are also widely used for clustering and segmentation such as in graph cut methods, for example ratio or Cheeger cuts [68, 69], graph modularity clustering [70, 71], and Ginzburg–Landau segmentation [72–74].…”
Section: Introduction (mentioning)
confidence: 99%
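As a point of reference for the graph cut objectives named in the excerpt above, the ratio and Cheeger cut functionals are commonly written as follows; the notation is mine and exact normalizations vary across the cited works.

```latex
% Common definitions (assumed normalizations) for a weighted graph with vertex
% set V, edge weights w_ij, and cut weight Cut(A, A^c) = sum_{i in A, j not in A} w_ij.
\[
  \mathrm{RatioCut}(A)
  = \frac{\mathrm{Cut}(A, A^{c})}{|A|} + \frac{\mathrm{Cut}(A, A^{c})}{|A^{c}|},
  \qquad
  \mathrm{CheegerCut}(A)
  = \frac{\mathrm{Cut}(A, A^{c})}{\min\{|A|,\, |A^{c}|\}}.
\]
% Minimizing CheegerCut over subsets A of V gives the graph Cheeger constant,
% the discrete quantity whose continuum limit is the subject of the paper above.
```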