1997
DOI: 10.1007/bfb0024483

Non-oblivious local search for MAX 2-CCSP with application to MAX DICUT

Cited by 15 publications (19 citation statements)
References 20 publications
“…This is perhaps the most noteworthy of our algorithms; it proceeds by locally optimizing a smoothed variant of f(S), obtained by biased sampling depending on S. The approach of locally optimizing a modified function has been referred to as "non-oblivious local search" in the literature; e.g., see [2] for a non-oblivious local search 2/5-approximation for the Max Di-Cut problem. Another (simpler) 2/5-approximation algorithm for Max DiCut appears in [21]. However, these algorithms do not generalize naturally to ours and the re-appearance of the same approximation factor seems coincidental.…”
Section: Model
confidence: 99%
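The passage above only names the technique, so a minimal sketch may help: non-oblivious local search runs an ordinary single-flip local search, but the flips are guided by a modified potential rather than by the objective itself. The graph and the potential weights below are illustrative assumptions for this sketch, not the specific 2/5-approximation potential from the paper.

```python
import random

def dicut_value(arcs, S):
    """True Max DiCut objective: number of arcs leaving S."""
    return sum(1 for u, v in arcs if u in S and v not in S)

def local_search(arcs, n, potential, seed=0):
    """Generic single-flip local search: move one vertex in or out
    of S as long as the given potential strictly improves."""
    rng = random.Random(seed)
    S = {v for v in range(n) if rng.random() < 0.5}
    improved = True
    while improved:
        improved = False
        for v in range(n):
            T = S ^ {v}  # flip vertex v
            if potential(arcs, T) > potential(arcs, S):
                S, improved = T, True
    return S

def nonoblivious_potential(arcs, S):
    """A non-oblivious guide function: weight crossing arcs more
    heavily and give partial credit to arcs internal to S. These
    weights are illustrative, not the ones from the paper."""
    forward = sum(1 for u, v in arcs if u in S and v not in S)
    internal = sum(1 for u, v in arcs if u in S and v in S)
    return 3 * forward + internal

arcs = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
S = local_search(arcs, 4, nonoblivious_potential)
print(dicut_value(arcs, S))  # → 2
```

The only difference from oblivious local search is which function guides the flips; the returned set is still evaluated with the true cut value.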
“…Unlike submodular minimization [44,26], submodular function maximization is NP-hard as it generalizes many NP-hard problems, like Max-Cut [19,14] and maximum facility location [9,10,2]. Other than generalizing combinatorial optimization problems like Max Cut [19], Max Directed Cut [4,22], hypergraph cut problems, maximum facility location [2,9,10], and certain restricted satisfiability problems [25,14], maximizing non-monotone submodular functions has applications in a variety of problems, e.g., computing the core value of supermodular games [46], and optimal marketing for revenue maximization over social networks [23]. As an example, we describe one important application in the statistical design of experiments.…”
Section: Introduction
confidence: 99%
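The claim that Max Directed Cut is a special case of non-monotone submodular maximization can be checked directly on a small instance: the directed cut function satisfies the diminishing-returns inequality but is not monotone (the full vertex set cuts nothing). The 4-vertex digraph below is an illustrative example, not one taken from the cited works.

```python
from itertools import combinations

def cut(arcs, S):
    """Directed cut value: arcs leaving S."""
    return sum(1 for u, v in arcs if u in S and v not in S)

def subsets(V):
    """All subsets of V, as sets."""
    V = sorted(V)
    return [set(c) for r in range(len(V) + 1) for c in combinations(V, r)]

arcs = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
V = set(range(4))

# Submodularity: the marginal gain of adding v can only shrink
# as the set grows (checked exhaustively over all A ⊆ B, v ∉ B).
submodular = all(
    cut(arcs, A | {v}) - cut(arcs, A) >= cut(arcs, B | {v}) - cut(arcs, B)
    for B in subsets(V)
    for A in subsets(B)
    for v in V - B
)
# Non-monotone: some proper subset cuts more than the full vertex set.
nonmonotone = any(cut(arcs, S) > cut(arcs, V) for S in subsets(V))
print(submodular, nonmonotone)  # → True True
```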
“…Our main technique for the above results is local search. Our local search algorithms are different from the previously used variant of local search for unconstrained maximization of a non-negative submodular function [15], or the local search algorithms used for Max Directed Cut [4,22]. In the design of our algorithms, we also use structural properties of matroids, a fractional relaxation of submodular functions, and a randomized rounding technique.…”
Section: Introduction
confidence: 99%
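For context, the flip-based local search for unconstrained non-negative submodular maximization that the passage contrasts with (attributed there to [15]) can be sketched as follows: add or remove single elements while the value strictly improves, then return the better of the local optimum and its complement. The cut-function instance is an illustrative assumption, and this sketch omits the approximate-improvement threshold used in the actual analysis.

```python
def local_search_submodular(f, V):
    """Single-flip local search for a non-negative set function f,
    in the spirit of the unconstrained submodular maximization
    heuristic: flip elements while f strictly improves, then
    return the better of the local optimum and its complement."""
    V = set(V)
    S = set()
    improved = True
    while improved:
        improved = False
        for v in V:
            T = S ^ {v}  # add or remove v
            if f(T) > f(S):
                S, improved = T, True
    comp = V - S
    return S if f(S) >= f(comp) else comp

# Illustrative instance: the directed cut function of a small digraph.
arcs = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 0)]
def f(S):
    return sum(1 for u, v in arcs if u in S and v not in S)

S = local_search_submodular(f, range(4))
print(f(S))  # → 2
```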
“…Unlike submodular minimization [46,26], submodular function maximization is NP-hard as it generalizes many NP-hard problems, like Max-Cut [19,14] and maximum facility location [9,10,2]. Other than generalizing combinatorial optimization problems like Max Cut [19], Max Directed Cut [4,22], hypergraph cut problems, maximum facility location [2,9,10], and certain restricted satisfiability problems [25,14], maximizing non-monotone submodular functions has applications in a variety of problems, e.g., computing the core value of supermodular games [48], and optimal marketing for revenue maximization over social networks [23]. As an example, we describe one important application in the statistical design of experiments.…”
confidence: 99%