1995
DOI: 10.1006/jcss.1995.1031

Approximation Properties of NP Minimization Classes

Cited by 52 publications (23 citation statements)
References 14 publications
“…The class MIN F⁺Π₁ was introduced by Kolaitis and Thakur in a framework of syntactically defined classes of optimization problems [26]. In a follow-up paper they showed that every problem in MIN F⁺Π₁ is constant-factor approximable [27]. We will prove that the standard parameterization of any problem in MIN F⁺Π₁ admits a polynomial kernelization.…”
Section: Hypergraphs and Sunflowers (mentioning)
confidence: 88%
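For context, the "standard parameterization" in this excerpt takes the solution size k as the parameter, and a polynomial kernelization is a polynomial-time reduction to an equivalent instance whose size is bounded by a polynomial in k. A rough sketch in standard parameterized-complexity notation (ours, not quoted from the citing paper):

\[
  (A, k) \;\longmapsto\; (A', k') \quad \text{in time } \mathrm{poly}(|A| + k), \qquad |A'| + k' \le p(k), \qquad \mathrm{opt}(A) \le k \iff \mathrm{opt}(A') \le k'.
\]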
“…Its superclass MAX NP also contains Max Sat amongst others. Kolaitis and Thakur generalized the approach of examining the logical definability of optimization problems and defined further classes of minimization and maximization problems [26,27]. Amongst others they introduced the class MIN F⁺Π₁ of problems whose optimum can be expressed as the minimum weight of an assignment (i.e., number of ones) that satisfies a certain universal first-order formula.…”
Section: Kernel Size (mentioning)
confidence: 99%
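To make the definition in this excerpt concrete (our notation, a sketch of the general shape of the Kolaitis–Thakur classes rather than a quotation): a minimization problem Q lies in MIN F⁺Π₁ if its optimum on a finite structure A can be written as

\[
  \mathrm{opt}_Q(A) \;=\; \min_{S} \bigl\{\, |S| \;:\; (A, S) \models \forall \bar{x}\; \psi(\bar{x}, S) \,\bigr\},
\]

where ψ is quantifier-free and the guessed predicate S occurs only positively in ψ. Vertex Cover is the standard example:

\[
  \mathrm{vc}(G) \;=\; \min_{S} \bigl\{\, |S| \;:\; G \models \forall x\, \forall y\, \bigl( E(x, y) \rightarrow S(x) \lor S(y) \bigr) \,\bigr\}.
\]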
“…For instance, maximization problems defined by constraint free NP Datalog queries where negation is only applied to guess atoms or atoms not depending on guess atoms (called deterministic) are constant approximable (Greco and Saccà 1997). Indeed, these problems belong to the class of constant approximable optimization problems MAX Σ₁ (Kolaitis and Thakur 1995) and, therefore, NP Datalog could also be used to define the class of approximable optimization problems, but this is outside the scope of this paper.…”
Section: Definition (mentioning)
confidence: 99%
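For orientation, the maximization class mentioned here has, roughly, the following shape (our notation, a sketch rather than a quotation of the definition in Kolaitis and Thakur 1995): a problem Q is in MAX Σ₁ if

\[
  \mathrm{opt}_Q(A) \;=\; \max_{S} \bigl|\, \{\, \bar{x} \;:\; (A, S) \models \exists \bar{y}\; \psi(\bar{x}, \bar{y}, S) \,\} \,\bigr|
\]

for some quantifier-free ψ; dropping the existential quantifiers yields MAX Σ₀, i.e., MAX SNP.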
“…[Sketch] The proof is based on the logical characterization of optimization problems [16,18], the technique of Papadimitriou and Yannakakis [19], and the techniques we developed in Section 3. In fact, one can show that the converse of Lemma 5.2 is also true, i.e.…”
Section: Definition (mentioning)
confidence: 99%