Machine Learning Proceedings 1991
DOI: 10.1016/b978-1-55860-200-7.50080-5
An Investigation of Noise-Tolerant Relational Concept Learning Algorithms

Abstract: We discuss the types of noise that may occur in relational learning systems and describe two approaches to addressing noise in a relational concept learning algorithm. We then evaluate each approach experimentally.


Cited by 52 publications (36 citation statements)
References 8 publications
“…Examples of such systems include: (i) Reduced Error Pruning (REP) (Brunk et al, 1991), which incorporates an adaptation of decision tree pruning; (ii) Incremental Reduced Error Pruning (IREP) (Fürnkranz et al, 1994), an enhancement over REP, (iii) Repeated Incremental Pruning to Produce Error Reduction (RIPPER) (Cohen, 1995), a further enhancement over IREP, and (iv) Swap-1 (Weiss et al, 1993). All these systems use the covering algorithm for rule learning, shown in Figure 1, whereby rules are "learned" sequentially based on training examples.…”
Section: Previous Work
confidence: 99%
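The citation statement above describes the covering (separate-and-conquer) strategy shared by REP, IREP, RIPPER, and Swap-1: learn one rule at a time, remove the positive examples it covers, and repeat until no positives remain. A minimal sketch of that outer loop, with an illustrative `learn_one_rule` callback standing in for whichever greedy rule builder a given system uses (the helper names are assumptions, not from the paper):

```python
def covering(positives, negatives, learn_one_rule):
    """Sequential covering: learn rules until all positives are covered.

    learn_one_rule(pos, neg) is assumed to return a predicate (a callable
    taking one example) that covers at least some remaining positives.
    """
    rules = []
    pos = list(positives)
    while pos:
        rule = learn_one_rule(pos, negatives)      # greedily build one rule
        covered = [p for p in pos if rule(p)]
        if not covered:                            # no progress: stop rather than loop
            break
        rules.append(rule)
        pos = [p for p in pos if not rule(p)]      # separate: drop covered positives
    return rules
```

With a toy `learn_one_rule` that simply memorizes one remaining positive per iteration, the loop learns one rule per positive example; real systems generalize each rule over many positives before moving on.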
“…Examples of post pruning algorithms include REP (Reduced Error Pruning algorithm) [2], GROW [5], SSRR [20] and hybrid and incremental post pruning techniques [24].…”
Section: Rule Pruning
confidence: 99%
“…The first stage is a greedy process which constructs an initial rule set. This stage is based on an earlier rule-learning algorithm called incremental reduced error pruning (IREP) [Fürnkranz and Widmer 1994], which in turn is based on earlier work due to Quinlan [1990], Cohen [1993], Brunk and Pazzani [1991], and Pagallo and Haussler [1990]. The second stage is an "optimization" phase which attempts to further improve the compactness and accuracy of the rule set.…”
Section: Ripper
confidence: 99%
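The REP/IREP lineage cited above rests on reduced-error pruning: after a rule is grown on one part of the data, its final conditions are deleted as long as accuracy on a held-out pruning set does not drop. A hedged sketch of pruning a single conjunctive rule in that spirit (the function names and the accuracy criterion are illustrative; the actual systems differ in the exact pruning metric and search):

```python
def rule_accuracy(conditions, pruning_pos, pruning_neg):
    """Accuracy on the pruning set of the rule 'all conditions hold => positive'."""
    covers = lambda x: all(c(x) for c in conditions)
    tp = sum(1 for x in pruning_pos if covers(x))       # positives correctly covered
    tn = sum(1 for x in pruning_neg if not covers(x))   # negatives correctly rejected
    total = len(pruning_pos) + len(pruning_neg)
    return (tp + tn) / total if total else 0.0

def prune_rule(conditions, pruning_pos, pruning_neg):
    """Drop trailing conditions while held-out accuracy does not decrease."""
    best = list(conditions)
    best_acc = rule_accuracy(best, pruning_pos, pruning_neg)
    while len(best) > 1:
        candidate = best[:-1]                            # try removing the last-added condition
        acc = rule_accuracy(candidate, pruning_pos, pruning_neg)
        if acc < best_acc:                               # pruning hurts: keep the longer rule
            break
        best, best_acc = candidate, acc
    return best
```

The key design point, and the reason IREP prunes each rule immediately after growing it rather than post-pruning the whole rule set as REP does, is that the pruning set gives an unbiased estimate of each condition's value, so overfit conditions added late in the greedy growth phase can be stripped cheaply.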