1992
DOI: 10.1007/bf00992675

Learning conjunctions of Horn clauses

Abstract: An algorithm is presented for learning the class of Boolean formulas that are expressible as conjunctions of Horn clauses. (A Horn clause is a disjunction of literals, all but at most one of which is a negated variable.) The algorithm uses equivalence queries and membership queries to produce a formula that is logically equivalent to the unknown formula to be learned. The amount of time used by the algorithm is polynomial in the number of variables and the number of clauses in the unknown formula.
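To make the query model concrete, here is a minimal Python sketch of the two oracles the abstract refers to: a membership query classifies a single truth assignment, and an equivalence query either confirms the hypothesis or returns a counterexample. The clause encoding, the function names, and the brute-force equivalence check are illustrative assumptions, not the paper's construction; the learner in the paper interleaves equivalence queries (to obtain counterexamples) with membership queries (to classify refined assignments).

```python
from itertools import product

# A Horn clause is encoded as (antecedent, head): 'antecedent' is a frozenset
# of variable indices that must all be true, and 'head' is a single variable
# index, or None for a headless clause (antecedent -> False).  A Horn formula
# is a list of such clauses; a truth assignment is a 0/1 tuple.

def violates(assignment, clause):
    antecedent, head = clause
    body_true = all(assignment[v] for v in antecedent)
    return body_true and (head is None or not assignment[head])

def satisfies(assignment, formula):
    return not any(violates(assignment, c) for c in formula)

def membership_oracle(target):
    """Membership query: does this assignment satisfy the hidden formula?"""
    return lambda assignment: satisfies(assignment, target)

def equivalence_oracle(target, n_vars):
    """Equivalence query: None if the hypothesis is logically equivalent to
    the target, otherwise an assignment on which the two disagree.  The real
    oracle is abstract; brute force over 2^n assignments is illustration only."""
    def query(hypothesis):
        for assignment in product((0, 1), repeat=n_vars):
            if satisfies(assignment, target) != satisfies(assignment, hypothesis):
                return assignment
        return None
    return query

# Hypothetical hidden target over four variables, indexed 0..3:
# (v1 and v2 -> v4) and (v3 -> v2).
target = [(frozenset({0, 1}), 3), (frozenset({2}), 1)]
member = membership_oracle(target)
equiv = equivalence_oracle(target, 4)
print(member((1, 1, 1, 0)))   # False: the first clause is violated
print(equiv([]))              # a counterexample to the empty hypothesis
```

The polynomial bound stated in the abstract is on the number of such queries and the work between them, measured in the number of variables and the number of clauses of the unknown formula.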

Cited by 101 publications (113 citation statements)
References 11 publications
“…Moreover, the execution of the repeat loop minimizes the number of ones of the negative example s so that it fulfills the conditions of Lemma 1, and thus allows identifying the antecedent of some clause (or clauses) in F*. Note that a naive approach, consisting of performing a single execution of the inner for loop could fail to achieve that goal, as the following example proves: Suppose the clauses v1 ∧ v2 → v4 and v3 → v2 are in F*, s = 1110 is the negative example selected at line 7, and the for loop checks variables starting by v1 and ending by v4. Then the example obtained would be s = 1010, but the variables in ones(s) would not coincide with the variables of any antecedent in the clauses above.…”
Section: Lemma
confidence: 97%
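The listing the excerpt refers to (with its line 7 and its repeat and for loops) is not reproduced here, but the failure mode it describes can be checked with a generic sketch. The following Python illustration, with all names assumed rather than taken from the cited paper, simulates the membership test against the excerpt's two clauses and contrasts a single sweep over the variables with a sweep that repeats until nothing changes.

```python
def is_negative(s, clauses):
    """Simulated membership test: True when assignment s falsifies the target
    {v1 ∧ v2 -> v4, v3 -> v2}, with variables indexed 0..3 for v1..v4."""
    return any(all(s[v] for v in body) and not s[head] for body, head in clauses)

def minimize_ones(s, clauses, single_pass=False):
    """Turn 1s into 0s whenever the result is still a negative example.
    With single_pass=True only one sweep runs (the naive approach the excerpt
    warns about); otherwise sweeps repeat until a pass changes nothing."""
    s = list(s)
    while True:
        changed = False
        for v in range(len(s)):
            if s[v]:
                s[v] = 0
                if is_negative(s, clauses):
                    changed = True          # keep the smaller negative example
                else:
                    s[v] = 1                # it became positive, so revert
        if single_pass or not changed:
            return tuple(s)

target = [({0, 1}, 3), ({2}, 1)]            # v1 ∧ v2 -> v4 and v3 -> v2
print(minimize_ones((1, 1, 1, 0), target, single_pass=True))   # (1, 0, 1, 0)
print(minimize_ones((1, 1, 1, 0), target))                     # (0, 0, 1, 0)
```

A single sweep stops at 1010, whose true variables {v1, v3} match no antecedent, while the repeated sweep reaches 0010 and isolates the antecedent of v3 → v2, which is exactly the point the excerpt makes.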
“…For instance, the classes monotone DNF [2], k-term DNF or k-clause CNF [1] and read twice DNF [19] have been proved to be learnable using membership and equivalence queries. In [4], an algorithm that learns a subclass of CNF, the class of Horn formulas, is shown. It is known that allowing more than one unnegated literal per clause makes learning much more difficult (or perhaps impossible).…”
Section: Introduction
confidence: 99%
“…The problem of exactly learning propositional Horn sentences from membership and equivalence queries was shown to have polynomial runtime with a fixed number of variables and a fixed number of clauses [2]. This algorithm was later generalized to learning first-order Horn theories from equivalence and membership queries for several learning settings, including learning from interpretations and learning from entailment [6].…”
Section: Previous Work
confidence: 99%
“…The membership and equivalence query model has received much attention, due in part to the discovery of quite different interesting and efficient learning algorithms for a wide variety of types of functions [59], [6], [4], [10], [11], [54], [21], [23], [22]. This stands in contrast to the relatively few learning algorithms in the PAC model, or in the model of learning with equivalence queries only, and to the strong negative results that have been given for learning even apparently simple types of functions.…”
Section: Complexity Theory and Learning
confidence: 99%