2020
DOI: 10.1145/3414685.3417789
Chordal decomposition for spectral coarsening

Abstract: We introduce a novel solver to significantly reduce the size of a geometric operator while preserving its spectral properties at the lowest frequencies. We use chordal decomposition to formulate a convex optimization problem which allows the user to control the operator sparsity pattern. This allows for a trade-off between the spectral accuracy of the operator and the cost of its application. We efficiently minimize the energy with a change of variables and achieve state-of-the-art results on spectral coarseni…
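The preserved quantity is the low end of the operator's generalized spectrum. As a minimal sketch (not the paper's solver), assuming sparse stiffness/mass pairs (L, M) for the original operator and (L_c, M_c) for its coarsened counterpart are already assembled, one can compare their lowest eigenvalues directly; all names below are placeholders:

```python
import numpy as np
import scipy.sparse.linalg as spla

def lowest_eigenvalues(L, M, k=20):
    """First k generalized eigenvalues of L x = lam M x.
    Shift-invert at a small negative sigma, since a Laplacian L is singular at 0."""
    vals = spla.eigsh(L, k=k, M=M, sigma=-1e-8, which="LM",
                      return_eigenvectors=False)
    return np.sort(vals)

def spectral_error(L, M, L_c, M_c, k=20):
    """Relative discrepancy of the k lowest eigenvalues,
    skipping the (near-)zero constant mode."""
    lam_f = lowest_eigenvalues(L, M, k)
    lam_c = lowest_eigenvalues(L_c, M_c, k)
    return np.abs(lam_c[1:] - lam_f[1:]) / np.abs(lam_f[1:])
```

The trade-off described in the abstract is then between how small this low-frequency error stays and how sparse (hence cheap to apply) the coarsened operator is.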

Cited by 5 publications (6 citation statements) · References: 51 publications
“…We demonstrate our coarsening method using meshes and complexes and compare it against a baseline. Spectral approximations are evaluated using a generalization of functional maps [Chen et al. 2020; Lescoat et al. 2020; Liu et al. 2019], C = U_c^T P U, that is applicable to all Laplacian operators; ideally C should resemble a diagonal matrix.…”
Section: Discussion
confidence: 99%
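Written out, the quoted evaluation takes the lowest eigenvectors U of the fine operator, the eigenvectors U_c of the coarse operator, and the restriction matrix P from the fine to the coarse domain, and forms C = U_c^T P U. A minimal sketch; the off-diagonal measure below is an illustrative choice, not taken from the cited papers, and some variants additionally weight by the coarse mass matrix:

```python
import numpy as np

def functional_map(U_c, P, U):
    """C = U_c^T P U, the functional-map matrix from the quoted evaluation
    (mass-matrix weighting omitted, as in the quote)."""
    return U_c.T @ (P @ U)

def off_diagonal_energy(C):
    """Fraction of the squared Frobenius norm carried by off-diagonal entries;
    0 means C is perfectly diagonal, i.e. coarse and restricted fine
    eigenvectors align."""
    total = np.linalg.norm(C) ** 2
    diag = np.linalg.norm(np.diag(C)) ** 2
    return (total - diag) / total
```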
“…A tilde above the respective notations denotes their coarsened versions. They formulate coarsening as an optimization problem subject to various sparsity conditions [Liu et al. 2019], by detaching the mesh from the operator [Chen et al. 2020], and by localizing error computation to form a parallelizable strategy [Lescoat et al. 2020].…”
Section: Introduction
confidence: 99%
“…The latter set of methods has been shown to lead to excellent performance and scalability on tasks involving individual shapes, such as computing their Shape‐DNA [RWP06] descriptors or performing mesh filtering. Similarly, there exist several spectral coarsening and simplification approaches [LJO19, LLT*20, CLJL20] that explicitly aim to coarsen operators such as the Laplacian while preserving their low-frequency eigenpairs. Unfortunately, these methods typically rely on the eigenfunctions on the dense shapes, while the utility of the former approaches in the context of functional maps has not yet been fully analyzed and exploited, in part since, as we show below, this requires local approximation bounds.…”
Section: Related Work
confidence: 99%
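Shape‐DNA [RWP06], mentioned in the quote, is just the truncated Laplace–Beltrami spectrum used as an isometry-invariant signature; with the lowest_eigenvalues helper sketched after the abstract it reduces to one call (again only a sketch, not the cited implementation):

```python
def shape_dna(L, M, k=30):
    """Shape-DNA-style signature: the k smallest nonzero generalized
    eigenvalues of the (stiffness L, mass M) Laplace-Beltrami pair."""
    return lowest_eigenvalues(L, M, k + 1)[1:]  # drop the near-zero constant mode
```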
“…Similar methods have also been applied to tetrahedral mesh simplification, based on volume, quadric-based, and isosurface-preserving criteria [Chiang and Lu 2003; Chopra and Meyer 2002; Vo et al. 2007]. Recent methods formulate coarsening as an optimization problem subject to various sparsity conditions [Liu et al. 2019], by detaching the mesh from the operator [Chen et al. 2020], and by localizing error computation to form a parallelizable strategy [Lescoat et al. 2020]. The cotan Laplacian is a popular choice, and is used, via its functional maps [Ovsjanikov et al. 2016], to identify correspondences between partial meshes [Rodolà et al. 2017].…”
Section: Related Work
confidence: 99%
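The cotan Laplacian referred to in the quote can be assembled directly from a triangle mesh. A minimal sketch assuming vertices V (n×3) and faces F (m×3) as NumPy arrays; this is standard textbook assembly, not code from any of the cited papers:

```python
import numpy as np
import scipy.sparse as sp

def cotan_laplacian(V, F):
    """Sparse cotangent Laplacian L = D - W with
    w_ij = 0.5 * (cot(alpha_ij) + cot(beta_ij)) (positive semi-definite)."""
    n = V.shape[0]
    I, J, W = [], [], []
    for a, b, c in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
        # edge (F[:,a], F[:,b]) lies opposite the corner at F[:,c]
        u = V[F[:, a]] - V[F[:, c]]
        v = V[F[:, b]] - V[F[:, c]]
        cot = (u * v).sum(axis=1) / np.linalg.norm(np.cross(u, v), axis=1)
        I += [F[:, a], F[:, b]]
        J += [F[:, b], F[:, a]]
        W += [0.5 * cot, 0.5 * cot]
    # coo_matrix sums duplicate (i, j) entries, accumulating the two
    # cotangent contributions of each interior edge
    W_mat = sp.coo_matrix((np.concatenate(W),
                           (np.concatenate(I), np.concatenate(J))),
                          shape=(n, n)).tocsr()
    return sp.diags(np.asarray(W_mat.sum(axis=1)).ravel()) - W_mat
```

Paired with a (lumped) mass matrix, this L is the kind of operator whose low-frequency spectrum the coarsening methods above aim to preserve.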