Proceedings of the 56th Annual Design Automation Conference, 2019
DOI: 10.1145/3316781.3317809

Effective-Resistance Preserving Spectral Reduction of Graphs

Abstract: This paper proposes a scalable algorithmic framework for effective-resistance preserving spectral reduction of large undirected graphs. The proposed method allows computing much smaller graphs while preserving the key spectral (structural) properties of the original graph. Our framework is built upon the following three key components: a spectrum-preserving node aggregation and reduction scheme, a spectral graph sparsification framework with iterative edge weight scaling, as well as effective-resistance preserv…
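As background for the effective-resistance preservation that the abstract refers to, the minimal sketch below computes the effective resistance between two nodes of a small weighted graph directly from the Laplacian pseudoinverse. This only illustrates the quantity being preserved; it is not the paper's algorithm, which is designed precisely to avoid this dense O(n^3) computation. The weight matrix `W` and function names are made-up examples.

```python
import numpy as np

def laplacian(W: np.ndarray) -> np.ndarray:
    """Graph Laplacian L = D - W for a symmetric, nonnegative weight matrix W."""
    return np.diag(W.sum(axis=1)) - W

def effective_resistance(W: np.ndarray, u: int, v: int) -> float:
    """R_eff(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), with L^+ the pseudoinverse of L."""
    L_pinv = np.linalg.pinv(laplacian(W))
    e = np.zeros(W.shape[0])
    e[u], e[v] = 1.0, -1.0
    return float(e @ L_pinv @ e)

# Toy example: a weighted triangle (edge weights act as conductances).
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
print(effective_resistance(W, 0, 1))  # direct edge in parallel with the two-hop path
```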

Cited by 14 publications (18 citation statements). References 27 publications.
“…In the rest of this paper, we assume that $G = (V, E, w)$ is a weighted, undirected and connected graph, whereas $P = (V, E_s, w_s)$ is its sparsifier. To simplify our analysis, we assume the edge weights in the sparsifier remain the same as the original ones, though the latest iterative edge re-scaling schemes [36] can be applied to further improve the approximation. The descending eigenvalues of $L_P^+ L_G$ are denoted by $\lambda_{\max} = \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 1$, where $L_P^+$ denotes the Moore-Penrose pseudoinverse of $L_P$.…”
Section: Overview Of Our Approach
confidence: 99%
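The statement above measures sparsifier quality through the eigenvalues of $L_P^+ L_G$. The sketch below is an illustration-only, dense way to compute these relative eigenvalues for small graphs; the helper `laplacian` and the function name are mine, not from the cited work, and real implementations would use sparse, Krylov-based methods.

```python
import numpy as np

def laplacian(W: np.ndarray) -> np.ndarray:
    return np.diag(W.sum(axis=1)) - W

def relative_eigenvalues(W_G: np.ndarray, W_P: np.ndarray) -> np.ndarray:
    """Eigenvalues of pinv(L_P) @ L_G, sorted in descending order (trivial mode dropped)."""
    L_G, L_P = laplacian(W_G), laplacian(W_P)
    evals = np.real(np.linalg.eigvals(np.linalg.pinv(L_P) @ L_G))
    evals = np.sort(evals)[::-1]
    return evals[:-1]  # drop the zero eigenvalue coming from the all-ones null space

# If P keeps a subset of G's edges with unchanged weights, all returned eigenvalues
# are >= 1, and the largest one bounds the worst-case spectral distortion of P.
```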
“…Examples include scientific computing and numerical optimization [8,13,26], graph partitioning and data clustering [15,22], machine learning and data mining [6,14], as well as integrated circuit modeling, simulation and verification [11,29,30]. In particular, the latest theoretical breakthroughs in spectral graph theory have led to the development of nearly-linear time spectral graph sparsification [8,9,16,25] and coarsening algorithms [18,19,31,33]. These techniques can efficiently produce much smaller graphs that well preserve the key spectral properties of the original graph (e.g., the first few eigenvalues and eigenvectors of the graph Laplacian), which in turn has led to much faster algorithms for solving partial differential equations (PDEs) and linear systems of equations [21,25,32], spectral clustering and graph partitioning [9,15,22,31], and dimensionality reduction and data visualization [33].…”
Section: Introduction
confidence: 99%
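One application listed in the statement above, faster solution of Laplacian linear systems, can be sketched as follows: the Laplacian of a (hypothetical) sparsifier is factorized once and used as a preconditioner for conjugate gradient on the original Laplacian. This is a generic, hedged illustration with assumed dense inputs `W_G` and `W_P`, not the nearly-linear-time solvers cited in the statement.

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import cg, splu, LinearOperator

def laplacian(W: np.ndarray) -> np.ndarray:
    return np.diag(W.sum(axis=1)) - W

def solve_with_sparsifier(W_G, W_P, b, eps=1e-8):
    """Solve L_G x = b by CG, preconditioned with the sparsifier's factorized Laplacian."""
    n = W_G.shape[0]
    L_G = csc_matrix(laplacian(W_G))
    # The Laplacian is singular; a tiny diagonal shift makes the preconditioner factorizable.
    lu = splu(csc_matrix(laplacian(W_P) + eps * np.eye(n)))
    M = LinearOperator((n, n), matvec=lu.solve)
    b = b - b.mean()  # project b onto the range of L_G (orthogonal to the all-ones vector)
    x, info = cg(L_G, b, M=M)
    return x, info    # info == 0 means CG converged
```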
“…This paper introduces, for the first time, a solver-free spectral graph sparsification framework (SF-GRASS) by leveraging emerging spectral graph coarsening [31] and graph signal processing techniques [23]. Our approach first coarsens the original graph into increasingly smaller graphs while preserving the key graph spectral properties.…”
Section: Introduction
confidence: 99%
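To make the coarsening step described in the statement above concrete, here is a hedged, single-level sketch of node aggregation by greedy heavy-edge matching followed by a Galerkin projection of the Laplacian, $L_c = P^T L P$. It is not the SF-GRASS or spectral coarsening algorithm from the cited works; it only shows the kind of aggregation-and-reduction operation such frameworks build on, with all names chosen here for illustration.

```python
import numpy as np

def laplacian(W: np.ndarray) -> np.ndarray:
    return np.diag(W.sum(axis=1)) - W

def heavy_edge_aggregation(W: np.ndarray) -> np.ndarray:
    """Greedy matching: pair each unvisited node with its heaviest unvisited neighbor."""
    n = W.shape[0]
    label = -np.ones(n, dtype=int)          # aggregate id per fine node, -1 = unassigned
    next_id = 0
    for u in np.argsort(-W.sum(axis=1)):    # visit nodes with larger weighted degree first
        if label[u] != -1:
            continue
        label[u] = next_id
        nbrs = np.where((W[u] > 0) & (label == -1))[0]
        if nbrs.size > 0:
            label[nbrs[np.argmax(W[u, nbrs])]] = next_id
        next_id += 1
    P = np.zeros((n, next_id))
    P[np.arange(n), label] = 1.0            # 0/1 aggregation (prolongation) matrix
    return P

def coarsen_once(W: np.ndarray) -> np.ndarray:
    """One reduction level: coarse Laplacian L_c = P^T L P, then recover coarse weights."""
    P = heavy_edge_aggregation(W)
    L_c = P.T @ laplacian(W) @ P
    return np.diag(np.diag(L_c)) - L_c      # coarse weight matrix with zero diagonal
```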