Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms 2019
DOI: 10.1137/1.9781611975482.20

Submodular Function Maximization in Parallel via the Multilinear Relaxation

Abstract: Balkanski and Singer [4] recently initiated the study of adaptivity (or parallelism) for constrained submodular function maximization, and studied the setting of a cardinality constraint. Subsequent improvements for this problem by Balkanski, Rubinstein, and Singer [6] and Ene and Nguyen [21] resulted in a near-optimal (1 − 1/e − ε)-approximation in O(log n/ε²) rounds of adaptivity. Partly motivated by the goal of extending these results to more general constraints, we describe parallel algorithms for approxim…
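
The multilinear relaxation named in the title replaces the set function f with its multilinear extension F(x) = E[f(R(x))], where R(x) includes each element independently with probability x_i. Below is a minimal Monte Carlo sketch of estimating F(x) by sampling; the function names and the toy coverage function are illustrative assumptions, not code from the paper:

```python
import random

def multilinear_estimate(f, x, samples=1000):
    """Monte Carlo estimate of the multilinear extension F(x) = E[f(R(x))],
    where R(x) keeps each element i independently with probability x[i].
    `f` maps a set of elements to a real value; `x` is a dict {element: prob}."""
    total = 0.0
    for _ in range(samples):
        # Sample a random set R(x).
        r = {i for i, p in x.items() if random.random() < p}
        total += f(r)
    return total / samples

# Example: a small coverage-style submodular function f(S) = |union of sets indexed by S|.
sets = {0: {1, 2}, 1: {2, 3}, 2: {4}}
f = lambda S: len(set().union(*(sets[i] for i in S))) if S else 0
print(multilinear_estimate(f, {0: 0.5, 1: 0.5, 2: 0.5}))
```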

Cited by 46 publications (47 citation statements) · References 32 publications

“…The first condition, (1.A), says we do not sample past the point where the margins are decreasing by a lot. A simpler form appears in previous work in the monotone cardinality setting [21]. The second condition, (1.B), is a more significant departure from the cardinality setting, and strikes a balance between trying to span many elements and not having to prune too many of the sampled elements.…”
Section: Greedy Sampling
confidence: 99%
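
A rough reading of condition (1.A) from this excerpt, sketched in code: grow a random block only while the estimated marginal gain of one more sampled element stays above a fixed fraction of a threshold. All names, parameters, and the exact stopping rule below are assumptions for illustration, not the cited paper's algorithm:

```python
import random

def sample_until_margins_drop(f, candidates, S, threshold, eps=0.1, batch=4, trials=50):
    """Grow a random block of elements on top of the current solution S, stopping
    once the average marginal gain of one more sampled element falls below
    (1 - eps) * threshold, i.e. once margins have "decreased by a lot"."""
    block, pool = [], list(candidates)
    while pool:
        # Estimate the average marginal gain of a random next element.
        base = f(set(S) | set(block))
        gains = [f(set(S) | set(block) | {random.choice(pool)}) - base for _ in range(trials)]
        if sum(gains) / len(gains) < (1 - eps) * threshold:
            break  # stop sampling (analogue of condition 1.A)
        # Otherwise commit a small random batch and continue.
        random.shuffle(pool)
        block.extend(pool[:batch])
        pool = pool[batch:]
    return block
```
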
“…We can both search for the appropriate value of δ and sample with probability δ in parallel. The basic idea of greedy sampling is directly inspired by a much simpler greedy sampling procedure in the cardinality setting in our previous work [21]. We now iterate along greedy blocks, where each iteration is with respect to the residual system induced by previously selected greedy blocks.…”
Section: Overview of Techniques
confidence: 99%
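
To illustrate the quoted idea of searching for δ while sampling with probability δ in parallel, one hypothetical sketch evaluates a geometric grid of sampling rates concurrently and keeps the largest rate whose sample passes some acceptance test; the grid, the thread-pool choice, and the `accepts` predicate are all assumptions rather than the paper's procedure:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def evaluate_delta(delta, ground_set, accepts):
    """Draw one sample keeping each element with probability delta and test it."""
    sample = {e for e in ground_set if random.random() < delta}
    return delta, accepts(sample)

def parallel_delta_search(ground_set, accepts, eps=0.5):
    """Try a geometric grid of sampling rates delta in parallel and return the
    largest delta whose sample passes the acceptance test (None if none pass)."""
    grid, d = [], 1.0
    while d > 1.0 / max(len(ground_set), 2):
        grid.append(d)
        d *= (1 - eps)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda d: evaluate_delta(d, ground_set, accepts), grid))
    feasible = [d for d, ok in results if ok]
    return max(feasible) if feasible else None

# Example usage with a toy acceptance test (at most 10 sampled elements).
print(parallel_delta_search(range(100), lambda S: len(S) <= 10))
```
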
“…Functions with bounded curvature have also been studied using adaptive sampling under a cardinality constraint [BS18b]. The more general family of packing constraints, which includes partition and laminar matroids, has been considered in [CQ19]. In particular, under m packing constraints, a (1 − 1/e − ε)-approximation was obtained in O(log² m log n) rounds using a combination of continuous optimization and multiplicative weight update techniques.…”
Section: Introduction
confidence: 99%
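
As a loose illustration of the multiplicative weight update ingredient mentioned for packing constraints Ax ≤ b, the sketch below exponentially up-weights constraints that are nearly tight under the current fractional point; the step size, the normalization, and all names are assumptions rather than the cited algorithm:

```python
import math

def mwu_packing_step(x, A, b, weights, eta=0.1):
    """One multiplicative-weight update over m packing constraints Ax <= b:
    constraints closer to tight under the fractional point x receive
    exponentially larger weight, steering later updates away from them."""
    loads = [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) / cap
             for row, cap in zip(A, b)]
    new_weights = [w * math.exp(eta * load) for w, load in zip(weights, loads)]
    total = sum(new_weights)
    return [w / total for w in new_weights]

# Example usage with two packing constraints over three coordinates.
A = [[1.0, 2.0, 0.0], [0.0, 1.0, 1.0]]
b = [1.0, 1.0]
print(mwu_packing_step([0.2, 0.3, 0.1], A, b, [0.5, 0.5]))
```
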
“…The approximation guarantee of [4] was very quickly improved in several independent works [3,20,23] to 1 − 1/e − ε (using O(ε⁻² log n) adaptive rounds), which almost matches an impossibility result by [42] showing that no polynomial-time algorithm can achieve a (1 − 1/e + ε)-approximation for the problem, regardless of the amount of adaptivity it uses. It should also be noted that [23] manages to achieve the above parameters while keeping the query complexity linear in n. An even more recent line of work studies algorithms with low adaptivity for more general submodular maximization problems, including problems with non-monotone objective functions and/or constraints beyond the cardinality constraint [15,21,22]. Since all these results achieve constant approximation for problems generalizing the maximization of a monotone submodular function subject to a cardinality constraint, they all inherit the impossibility result of [4], and thus use at least Ω(log n) adaptive rounds.…”
Section: Introduction
confidence: 99%