2002
DOI: 10.1007/3-540-45675-9_2

Pre-pruning Classification Trees to Reduce Overfitting in Noisy Domains


Cited by 16 publications (7 citation statements)
References 3 publications
“…Overfitting can lead to an excessively large number of rules, many of which have very little predictive value for unseen data [14]. There are two techniques for pruning: pre-pruning and post-pruning, which are discussed next in this paper.…”
Section: Pruning Methods For Decision Trees
confidence: 99%
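The pre-pruning vs. post-pruning distinction described above can be illustrated with a minimal sketch. This uses scikit-learn purely as an assumption — the cited papers do not use this library, and the paper's own J-pruning criterion is not implemented here; `max_depth`/`min_samples_leaf` stand in for a generic pre-pruning stopping rule, and cost-complexity pruning (`ccp_alpha`) stands in for a post-pruning method:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Unpruned baseline: the tree grows until every leaf is pure,
# which tends to overfit noisy training data.
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# Pre-pruning: stop growth early via depth / leaf-size constraints,
# so some branches are never expanded at all.
pre = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10,
                             random_state=0).fit(X_tr, y_tr)

# Post-pruning: grow the full tree first, then cut back subtrees
# whose complexity is not justified (cost-complexity pruning).
post = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

print("leaves:", full.get_n_leaves(), pre.get_n_leaves(), post.get_n_leaves())
```

Both pruned variants produce a smaller tree than the unpruned baseline; the practical difference is that pre-pruning saves the cost of growing branches that will be discarded, while post-pruning can see the whole tree before deciding what to cut.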
“…Even though the decision trees generated by J48 and J48graft were accurate and efficient, they resulted in bulky trees, leading to the problem of overfitting [10]. Overfitting results in decision trees that are more complex than necessary.…”
Section: Pruning Methods For Decision Trees
confidence: 99%
“…In order to illustrate the algorithm DDPA more clearly, let's look at the following example in Table 2, where the condition attributes are Outlook, Tem (temperature), Hum (humidity) and Windy, and the decision attribute is d. We use the algorithm DDPA, ID3 [25] and a pre-pruning algorithm, J-pruning [26], to create three decision trees as shown in Fig. 8, Fig.…”
Section: Data-driven Decision Tree Learning Algorithm Based On Knowledge
confidence: 99%
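The attributes named in that statement (Outlook, Temperature, Humidity, Windy, decision d) match Quinlan's classic 14-row play-tennis dataset. A small sketch of building an ID3-style tree on that data follows; the row values are taken from the standard published dataset, not from the cited paper's Table 2, and scikit-learn's `criterion="entropy"` stands in for ID3's information-gain splitting:

```python
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier

# Standard play-tennis data (Quinlan); assumed, not copied from the cited Table 2.
data = pd.DataFrame({
    "Outlook": ["sunny", "sunny", "overcast", "rain", "rain", "rain", "overcast",
                "sunny", "sunny", "rain", "sunny", "overcast", "overcast", "rain"],
    "Tem":     ["hot", "hot", "hot", "mild", "cool", "cool", "cool",
                "mild", "cool", "mild", "mild", "mild", "hot", "mild"],
    "Hum":     ["high", "high", "high", "high", "normal", "normal", "normal",
                "high", "normal", "normal", "normal", "high", "normal", "high"],
    "Windy":   [False, True, False, False, False, True, True,
                False, False, False, True, True, False, True],
    "d":       ["no", "no", "yes", "yes", "yes", "no", "yes",
                "no", "yes", "yes", "yes", "yes", "yes", "no"],
})

# Encode the categorical condition attributes as integers for the learner.
X = OrdinalEncoder().fit_transform(data[["Outlook", "Tem", "Hum", "Windy"]])
y = data["d"]

# criterion="entropy" mirrors ID3's information-gain split selection.
tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
print("depth:", tree.get_depth(), "leaves:", tree.get_n_leaves())
```

Because the 14 rows contain no conflicting examples, the unpruned tree fits the training data perfectly — exactly the kind of fully grown tree that pre-pruning methods such as J-pruning would truncate on noisy data.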
“…In order to test the performance of the algorithm DDPA proposed in this paper, we compare it with the pre-pruning algorithm J-pruning [26] and the post-pruning method reduced-error pruning (REP) [39].…”
Section: Performance Test
confidence: 99%