2013
DOI: 10.1007/978-3-642-40328-6_1

Spectral Sparsification in Dynamic Graph Streams

Abstract: We present a new bound relating edge connectivity in a simple, unweighted graph with effective resistance in the corresponding electrical network. The bound is tight. While we believe the bound is of independent interest, our work is motivated by the problem of constructing combinatorial and spectral sparsifiers of a graph, i.e., sparse, weighted sub-graphs that preserve cut information (in the case of combinatorial sparsifiers) and additional spectral information (in the case of spectral sparsifiers…
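For reference, the spectral-sparsifier guarantee alluded to in the abstract is standardly stated in terms of graph Laplacians; the notation below is background for the reader, not text quoted from the paper:

```latex
% A (1 \pm \epsilon) spectral sparsifier H of G satisfies, for all x \in \mathbb{R}^n,
(1-\epsilon)\, x^{\top} L_G\, x \;\le\; x^{\top} L_H\, x \;\le\; (1+\epsilon)\, x^{\top} L_G\, x .
% Restricting x to 0/1 indicator vectors of vertex sets recovers the cut
% (combinatorial) sparsifier guarantee, since x^T L x equals the weight of the
% corresponding cut.
```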

Cited by 31 publications (21 citation statements) | References 22 publications
“…This result is part of a growing body of work on processing hypergraphs in the data stream model [12,23,26,27,29]. There are numerous challenges in extending previous work on graph sparsification [3,4,16,19,20] to hypergraph sparsification and we discuss these in Section 5. In the process of overcoming these challenges, we also identify a simpler approach for graph sparsification in the data stream model.…”
Section: Our Contributions and Related Work
confidence: 96%
“…Another downside to the previous approach is that the Fung et al. result does not seem to extend to the case of hypergraphs. Using our new-found ability (see the previous section) to find the entire set of edges that are not k-strong, we present an algorithm that a) has a simpler, and almost self-contained, analysis and b) extends to hypergraphs. Our approach is closer in spirit to Benczúr and Karger's original work on sparsification [6], which in turn is based on the following result by Karger [22]: if we sample each edge with probability p ≥ p* = cε⁻²λ⁻¹ log n, where λ is the cardinality of the minimum cut and c > 0 is some constant, and weight the sampled edges by 1/p, then the resulting graph is a sparsifier with high probability.…”
Section: Hypergraph Sparsification
confidence: 99%
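To make the Karger-style sampling step quoted above concrete, here is a minimal Python sketch that keeps each edge independently with probability p and weights survivors by 1/p; the function name, the constant c, and the cap at p = 1 are illustrative assumptions, not code from the cited papers.

```python
import math
import random

def uniform_sample_sparsifier(edges, n, min_cut, eps, c=2.0, seed=0):
    """Keep each edge independently with probability p and weight survivors by 1/p.

    edges: iterable of (u, v) pairs of a simple unweighted graph
    n: number of vertices; min_cut: (an estimate of) the minimum cut value
    """
    rng = random.Random(seed)
    # Karger-style rate: p >= c * eps^-2 * lambda^-1 * log n, capped at 1.
    p = min(1.0, c * eps ** -2 * math.log(n) / min_cut)
    sampled = []
    for u, v in edges:
        if rng.random() < p:
            sampled.append((u, v, 1.0 / p))  # reweighting keeps cut values unbiased in expectation
    return sampled

# Example: an 8-cycle (minimum cut 2); with these parameters p caps at 1, so every edge survives.
edges = [(i, (i + 1) % 8) for i in range(8)]
print(uniform_sample_sparsifier(edges, n=8, min_cut=2, eps=0.5))
```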
“…The effective resistance r_e is a more nuanced quantity than λ_e in the sense that λ_e only depends on the number of edge-disjoint paths between the endpoints of e whereas the lengths of these paths are also relevant when calculating the effective resistance r_e. However, the two quantities are related by the following inequality [7],…”
Section: Min-cut and Sparsification
confidence: 99%
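The excerpt cuts off before the inequality itself; as background (and not necessarily the exact tight bound proved in [7]), the elementary two-sided relationship between r_e and λ_e in a simple unweighted graph can be stated as follows:

```latex
% Lower bound: setting the potential to 1 on one side of a minimum cut separating
% the endpoints of e and to 0 on the other gives Dirichlet energy \lambda_e, so
% the effective conductance is at most \lambda_e and
r_e \;\ge\; \frac{1}{\lambda_e}.
% Upper bound: if the \lambda_e edge-disjoint paths between the endpoints of e
% each have length at most \ell, routing 1/\lambda_e units of current along each
% path is a unit flow of energy \ell/\lambda_e, so by Thomson's principle
r_e \;\le\; \frac{\ell}{\lambda_e},
% which is why the lengths of the paths, and not just their number, matter.
```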
“…In this section, we briefly outline how to perform such sampling. We refer the reader to Ahn et al. [6,7] for details regarding independence issues and how to reweight the edges. The challenge is that we do not know the values of λ_e ahead of time.…”
Section: Min-cut and Sparsification
confidence: 99%
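One common way around not knowing λ_e up front, sketched below purely as an illustration (the names and the hash-based construction are assumptions, and the independence and reweighting subtleties the authors defer to [6,7] are ignored), is to sample every edge obliviously at all geometric rates 2⁰, 2⁻¹, 2⁻², … via a hash, and only later, once λ_e has been estimated, decide at which single rate the edge is kept:

```python
import hashlib
import math

def edge_level(u, v, seed=0):
    """Assign edge {u, v} a geometric 'level' from a hash: P[level >= i] = 2^-i."""
    h = hashlib.sha256(f"{seed}:{min(u, v)}:{max(u, v)}".encode()).digest()
    x = int.from_bytes(h[:8], "big") / 2 ** 64  # pseudo-uniform in [0, 1)
    level = 0
    while level < 64 and x < 0.5 ** (level + 1):
        level += 1
    return level

def keep_edge(u, v, target_p, seed=0):
    """Decide after the fact whether {u, v} survives sampling at rate ~target_p.

    target_p would be derived from the later estimate of lambda_e (e.g. a
    Karger-style rate as in the previous sketch). The decision depends only on
    the hash, so insertions and deletions of the same edge are handled consistently.
    """
    i = max(0, math.ceil(-math.log2(target_p)))  # smallest i with 2^-i <= target_p
    if edge_level(u, v, seed) >= i:              # happens with probability 2^-i
        return True, 2.0 ** i                    # keep, weighted by 1/(2^-i)
    return False, 0.0

print(keep_edge(3, 7, target_p=0.25))
```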
“…We refer readers to [13] for an overview of a number of ℓ_0-samplers under a unified framework. Besides being used in various statistical estimations [14], ℓ_0-sampling finds applications in dynamic geometric problems (e.g., ϵ-approximation, minimum spanning tree [24]), and dynamic graph streaming algorithms (e.g., connectivity [1], graph sparsifiers [2,3], vertex cover [10,11], maximum matching [1,5,10,30], etc.; see [32] for a survey). However, all the algorithms for ℓ_0-sampling proposed in the literature only work for noiseless streaming datasets.…”
Section: Introduction
confidence: 99%
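As a concrete illustration of what an ℓ_0-sampler does, here is a heavily simplified Python sketch of the usual recipe (geometric subsampling plus 1-sparse recovery from the sums Σ aᵢ, Σ i·aᵢ, Σ i²·aᵢ); the class names, the memoized levels, and the deterministic consistency check are illustrative assumptions — a real sampler uses hash functions and fingerprints — and this is not the construction of any one cited paper.

```python
import random

class OneSparseDetector:
    """Tracks s0 = sum a_i, s1 = sum i*a_i, s2 = sum i^2*a_i over its substream."""
    def __init__(self):
        self.s0 = self.s1 = self.s2 = 0

    def update(self, i, delta):
        self.s0 += delta
        self.s1 += i * delta
        self.s2 += i * i * delta

    def recover(self):
        # The substream is 1-sparse iff s0 != 0 and s0*s2 == s1^2; a production
        # sampler replaces this check with a random fingerprint.
        if self.s0 != 0 and self.s0 * self.s2 == self.s1 * self.s1 and self.s1 % self.s0 == 0:
            return self.s1 // self.s0, self.s0  # (index, value)
        return None

class L0Sampler:
    """Return a random nonzero coordinate of a vector updated by insertions/deletions."""
    def __init__(self, num_levels=33, seed=0):
        self.levels = [OneSparseDetector() for _ in range(num_levels)]
        self.rng = random.Random(seed)
        self.level_of = {}  # memoized level per coordinate (a real sketch uses a hash)

    def _level(self, i):
        if i not in self.level_of:
            lvl = 0
            while lvl < len(self.levels) - 1 and self.rng.random() < 0.5:
                lvl += 1
            self.level_of[i] = lvl
        return self.level_of[i]

    def update(self, i, delta):
        # Coordinate i reaches levels 0..level(i); level l thus keeps it w.p. 2^-l.
        for l in range(self._level(i) + 1):
            self.levels[l].update(i, delta)

    def query(self):
        # Report from the sparsest level that is exactly 1-sparse.
        for det in reversed(self.levels):
            out = det.recover()
            if out is not None:
                return out
        return None  # isolation failed; the full construction repeats to make this unlikely

# Example: coordinate insertions followed by deletions.
s = L0Sampler(seed=1)
for i in (3, 7, 11):
    s.update(i, +1)
s.update(7, -1)
s.update(11, -1)
print(s.query())  # (3, 1): coordinate 3 is the only one still nonzero
```

In the dynamic graph-stream setting the coordinates are the potential edges of the graph, so a successful query returns an edge that is still present after all insertions and deletions.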