Research and Development in Intelligent Systems XVI 2000
DOI: 10.1007/978-1-4471-0745-3_7

Automatic Induction of Classification Rules from Examples Using N-Prism

Abstract: One of the key technologies of data mining is the automatic induction of rules from examples, particularly the induction of classification rules. Most work in this field has concentrated on the generation of such rules in the intermediate form of decision trees. An alternative approach is to generate modular classification rules directly from the examples. This paper seeks to establish a revised form of the rule generation algorithm Prism as a credible candidate for use in the automatic induction of classification rules.
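The 'separate and conquer' strategy behind Prism is compact enough to sketch. Below is a minimal Python illustration of the basic loop for a single class, assuming categorical attributes and instances stored as plain dicts; the function and field names are illustrative assumptions, not taken from the paper, and N-Prism's refinements (tie-breaking, clash handling, continuous attributes) are omitted.

def induce_rules_for_class(instances, attributes, target_class):
    """Sketch of the basic Prism loop: induce modular rules that
    together cover every instance of target_class."""
    rules = []
    remaining = list(instances)
    while any(inst["class"] == target_class for inst in remaining):
        covered = list(remaining)
        rule = []  # conjunction of (attribute, value) terms
        # Specialise until the rule covers only the target class.
        while any(inst["class"] != target_class for inst in covered):
            used = {a for a, _ in rule}
            candidates = []
            for attr in attributes:
                if attr in used:
                    continue
                for value in {inst[attr] for inst in covered}:
                    subset = [i for i in covered if i[attr] == value]
                    hits = sum(1 for i in subset if i["class"] == target_class)
                    candidates.append((hits / len(subset), len(subset), attr, value))
            if not candidates:
                break  # no attribute left to specialise on
            # Pick the term with the highest p(class | attr = value).
            _, _, attr, value = max(candidates)
            rule.append((attr, value))
            covered = [i for i in covered if i[attr] == value]
        rules.append((rule, target_class))
        # 'Separate': remove the instances this rule covers, then conquer the rest.
        remaining = [i for i in remaining if not all(i[a] == v for a, v in rule)]
    return rules

Calling induce_rules_for_class once per class against the full training set yields the complete classifier; because each class is handled independently, the resulting rules need not share a common root attribute the way decision-tree branches do.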

Cited by 39 publications (51 citation statements); references 4 publications.

“…'Divide and conquer' based classifiers produce a decision tree, such as Quinlan's C4.5 decision tree induction algorithm [18]; 'separate and conquer' based classifiers produce a set of IF...THEN classification rules, such as the Prism family of algorithms [8,4,5]. As pointed out in [20], most ensemble classifiers are based on the 'divide and conquer' approach even though Prism classifiers have shown to be less vulnerable to overfitting compared with decision tree based classifiers [4]. This is especially the case when confronted with noise and missing values in the data [4].…”
Section: Introduction (mentioning)
confidence: 99%
“…As pointed out in [20], most ensemble classifiers are based on the 'divide and conquer' approach even though Prism classifiers have shown to be less vulnerable to overfitting compared with decision tree based classifiers [4]. This is especially the case when confronted with noise and missing values in the data [4]. A recently developed ensemble classifier named Random Prism [20], which is inspired by RF and RDF, makes use of the Prism family of algorithms as base classifiers.…”
Section: Introduction (mentioning)
confidence: 99%
“…'Separate and conquer' can be traced back to Michalski's AQ system in the 1960s [16]. However the most notable development using the 'separate and conquer' approach is the Prism family of algorithms [8,3,4]. It produces modular rules that do not necessarily fit into a decision tree.…”
Section: Introduction (mentioning)
confidence: 99%
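To make "modular" concrete, consider the kind of rule set used to motivate Prism (attribute names hypothetical):

IF a = 1 AND b = 1 THEN class = yes
IF c = 1 AND d = 1 THEN class = yes

Because the two rules test disjoint attributes, no single decision tree can encode them without replicating subtrees and introducing redundant terms, whereas a separate-and-conquer learner can output them directly.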
“…Recent developments on the Prism family of algorithms includes frameworks for parallelising Prism algorithms for rule induction on massive datasets [23,25,24] and rule pruning methods in order to reduce overfitting [28,4]. In general Prism algorithms have been shown to be less vulnerable to overfitting compared with decision tree classifiers, especially if there is noise in the data and missing values [3]. Yet most ensemble learning approaches are either based on decision trees or a heterogeneous setup of base classifiers.…”
Section: Introduction (mentioning)
confidence: 99%
“…The aim is to generate rules with significantly fewer redundant terms than those derived from decision trees. Compared with decision trees Prism [1]:…”
Section: Inducing Modular Classification Rules Using Prism (mentioning)
confidence: 99%