Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021
DOI: 10.1145/3447548.3467367

An Efficient Framework for Balancing Submodularity and Cost

Abstract: In the classical selection problem, the input consists of a collection of elements, and the goal is to pick a subset of elements from the collection such that some objective function f is maximized. This problem has been studied extensively in the data-mining community and has multiple applications, including influence maximization in social networks, team formation, and recommender systems. A particularly popular formulation that captures the needs of many such applications is one where the objective function…


Cited by 25 publications (10 citation statements); references 29 publications.
“…In particular, they described algorithms with optimal approximation guarantees for this problem when g is a non-negative monotone submodular function, ℓ is a linear function, and the optimization is subject to either a matroid or a cardinality constraint. Later works obtained faster and semi-streaming algorithms for the same setting [7,10,11,14]. However, in contrast to all these (often tight) results for monotone submodular functions g, much less is known about the case of non-monotone submodular functions.…”
Section: Introduction
confidence: 99%
“…Is there an online algorithm that works for general non-monotone g and non-positive ℓ? We note that the semi-streaming algorithms studied by Kazemi et al. [Kaz+21] and Nikolakaki et al. [NET21] provide a (0.5, 1)-approximation algorithm for RegularizedUSM when g is monotone. For non-monotone USM, simply selecting each element with probability 0.5 achieves a 0.25-approximation [FMV11].…”
Section: A Appendix
confidence: 88%
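The random-sampling baseline quoted above is easy to check empirically. The sketch below is illustrative only: it uses a hypothetical toy coverage function (coverage functions are non-negative and submodular) and compares the brute-force optimum against the average value of a uniformly random subset, which by [FMV11] is at least OPT/4 in expectation for unconstrained submodular maximization.

```python
import itertools
import random

# Hypothetical toy instance: f(S) counts the items covered by the chosen
# elements. Coverage functions are non-negative and submodular.
COVERS = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
    "d": {1, 6},
    "e": {2, 5},
}

def f(subset):
    covered = set()
    for e in subset:
        covered |= COVERS[e]
    return len(covered)

elements = list(COVERS)

# Exact optimum by brute force (fine for a 5-element ground set).
opt = max(f(S)
          for r in range(len(elements) + 1)
          for S in itertools.combinations(elements, r))

# Random subset: include each element independently with probability 1/2.
rng = random.Random(0)
trials = 2000
avg = sum(f([e for e in elements if rng.random() < 0.5])
          for _ in range(trials)) / trials

# By [FMV11], E[f(random subset)] >= opt / 4 for any non-negative
# submodular f; the empirical average should comfortably clear that bar.
print(f"OPT = {opt}, average over random subsets = {avg:.2f}")
```

On this toy instance the random subset does much better than the worst-case 1/4 bound, which is typical for coverage-style objectives.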
“…In particular, [9] demonstrated the possibility of a greater-than-1/2-approximation in expectation, [10] precisely achieved a (1 − 1/e − ε)-approximation in O(k · 2^poly(1/ε)) memory, and most recently [11] achieved the same constant-factor approximation in improved O(k/ε) memory. Motivated by applications to team formation, [12] introduced a simple and efficient streaming algorithm which approximates an unconstrained, cost-diminished version of (1) for arbitrary D and submodular, non-negative f in O(k) memory. Though our optimization objective is fundamentally different from theirs (we optimize a submodular function while their objective is the difference between a submodular value function and an additive cost function), our selection rule and theoretical guarantees are inspired by their approach.…”
Section: Related Work
confidence: 99%
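The "submodular value minus additive cost" objective attributed to [12] admits a very simple single-pass sketch: keep an arriving element only if its marginal gain under f exceeds its cost. The instance, names, and the exact threshold below are illustrative assumptions, not the algorithm or guarantee from [12].

```python
# Hypothetical toy coverage instance with per-element costs.
COVERS = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1}}
COST = {"a": 1.0, "b": 1.5, "c": 1.0, "d": 2.0}

def f(subset):
    """Coverage value: number of distinct items covered (submodular)."""
    covered = set()
    for e in subset:
        covered |= COVERS[e]
    return len(covered)

def stream_select(stream):
    """One pass over the stream; memory is O(|S|), independent of the
    stream length. An element is kept only when its marginal gain under
    f strictly beats its cost."""
    S, val = [], 0
    for e in stream:
        gain = f(S + [e]) - val
        if gain > COST[e]:
            S.append(e)
            val += gain
    return S, val

S, val = stream_select(["a", "b", "c", "d"])
net = val - sum(COST[e] for e in S)
print(S, net)  # prints ['a', 'c'] 4.0
```

Because f is submodular, an element rejected once would have an even smaller marginal gain later, which is what makes threshold rules of this kind natural in the streaming setting.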