2015
DOI: 10.48550/arxiv.1511.03137
Preprint
k-way Hypergraph Partitioning via n-Level Recursive Bisection

Cited by 2 publications (4 citation statements)
References 0 publications
“…This is achieved by carefully parallelizing and engineering each phase of the multilevel framework. To further improve solution quality, future work includes parallelizing flow-based refinement techniques [20,22,26] as well as KaHyPar's n-level hierarchy approach [3,52]. Additionally, we want to speed up our FM implementation, and improve the solution quality of label propagation, e.g., using hill-scanning [36] or caching negative gain moves to enable otherwise infeasible high-gain moves.…”
Section: Discussion (mentioning; confidence: 99%)
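Label propagation refinement, one of the techniques the statement above wants to improve, repeatedly moves each node to the block preferred by its incident nets. A minimal sketch, assuming a toy representation (`nets` as a list of sets of node ids, `part` as a mutable block-id list) and omitting the balance constraint and gain caching that a real implementation needs:

```python
# Sketch of one label-propagation refinement pass over a hypergraph
# partition. Illustrative only: not KaHyPar's actual data structures.

def cut(nets, part):
    """Number of nets spanning more than one block."""
    return sum(1 for net in nets if len({part[v] for v in net}) > 1)

def label_propagation_pass(nets, part, k):
    """Move each node to the block that most reduces the local cut.
    Only nets incident to the moved node can change cut status, so
    comparing the cut over incident nets suffices as a gain measure.
    No balance constraint is enforced here (a real refiner must)."""
    incident = {}
    for net in nets:
        for v in net:
            incident.setdefault(v, []).append(net)
    for v in sorted(incident):
        best_block, best_cut = part[v], cut(incident[v], part)
        for b in range(k):
            part[v] = b
            c = cut(incident[v], part)
            if c < best_cut:
                best_block, best_cut = b, c
        part[v] = best_block  # keep the best block found
    return part
```

Without the balance constraint, a pass like this can collapse the partition into one block, which is exactly why practical label propagation pairs the gain test with a balance check.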
“…We compute initial k-way partitions via multilevel recursive bipartitioning using our parallel coarsening and refinement. Initial 2-way partitions on the coarsest hypergraphs are computed with the same portfolio of algorithms that is used in KaHyPar [52]. The portfolio contains 9 different sequential algorithms, including greedy hypergraph growing [12], random assignment, hypergraph growing with label propagation, and alternating BFS [25].…”
Section: Initial Partitioning (mentioning; confidence: 99%)
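Greedy hypergraph growing, one of the portfolio algorithms named above, computes a bisection by growing one block outward from a seed node. A simplified sketch under toy assumptions: plain BFS order over incident nets stands in for the gain-based priority queue and weight-aware balance handling of real implementations:

```python
from collections import deque

def greedy_hypergraph_growing(num_nodes, nets, seed=0):
    """Grow block 0 from `seed` via BFS over incident nets until half
    the nodes are assigned; the remainder forms block 1.
    Illustrative sketch only."""
    incident = [[] for _ in range(num_nodes)]
    for net in nets:
        for v in net:
            incident[v].append(net)
    part = [1] * num_nodes          # everything starts in block 1
    queue, seen = deque([seed]), {seed}
    target = num_nodes // 2         # unit node weights assumed
    assigned = 0
    while queue and assigned < target:
        v = queue.popleft()
        part[v] = 0
        assigned += 1
        for net in incident[v]:     # enqueue pins of incident nets
            for u in net:
                if u not in seen:
                    seen.add(u)
                    queue.append(u)
    return part
```

On a path-like instance the grown block stays contiguous, which is the intuition behind seeding-based initial partitioning.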
“…This process is repeated until the algorithm can no longer find any non-singleton clusters or the number of hypernodes drops below 160k. The second condition is the same stopping criterion used in [62].…”
Section: Acyclic Coarsening (mentioning; confidence: 99%)
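The two-part stopping criterion quoted above can be sketched as a simple driver loop; `find_clusters` and `contract` are hypothetical placeholders for the clustering and contraction phases, and the 160k threshold is the one named in the statement:

```python
def coarsen(num_nodes, find_clusters, contract, limit=160_000):
    """Repeatedly contract clusters until either no non-singleton
    cluster remains or the hypernode count drops below `limit`.
    `find_clusters(n)` returns a clustering of n hypernodes;
    `contract(clusters)` returns the new hypernode count."""
    while num_nodes >= limit:
        clusters = find_clusters(num_nodes)
        if all(len(c) == 1 for c in clusters):
            break  # only singletons left: nothing to contract
        num_nodes = contract(clusters)
    return num_nodes
```

With a clustering that pairs nodes, each round roughly halves the hypergraph until the threshold is crossed, which is the intended behavior of the criterion.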
“…k-way FM Refinement. The k-way FM refinement aims to improve a given k-way partition and is based on the k-way FM refinement algorithm implemented in KaHyPar [62]. The algorithm maintains k PQs, one queue for each block.…”
Section: Acyclic Refinement (mentioning; confidence: 99%)
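The k-queue organization described above can be sketched as follows. The static, precomputed `gains[v][b]` table is an illustrative assumption: KaHyPar's k-way FM recomputes gains after every move, allows temporarily worsening moves, and rolls back to the best prefix, none of which this sketch does:

```python
import heapq

def kway_fm_pass(nodes, gains, k, part):
    """Simplified k-way FM pass with one priority queue per block.
    heapq is a min-heap, so gains are stored negated; each entry is
    (-gain, node, target_block). Nodes are locked after one move."""
    pqs = [[] for _ in range(k)]
    for v in nodes:
        for b in range(k):
            if b != part[v]:
                heapq.heappush(pqs[b], (-gains[v][b], v, b))
    locked, moves = set(), []
    while any(pqs):
        # pick the queue whose top entry has the globally best gain
        best_q = max((pq for pq in pqs if pq), key=lambda pq: -pq[0][0])
        neg_gain, v, b = heapq.heappop(best_q)
        if v in locked:
            continue            # stale entry for an already-moved node
        if -neg_gain <= 0:
            break               # best remaining move is non-improving
        part[v] = b             # apply the move and lock the node
        locked.add(v)
        moves.append((v, b, -neg_gain))
    return moves
```

Keeping one queue per target block is what lets the refiner also filter candidate moves by per-block balance feasibility, which this sketch omits.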