2009
DOI: 10.1287/ijoc.1080.0278

An Optimal Constrained Pruning Strategy for Decision Trees

Abstract: This paper is concerned with the optimal constrained pruning of decision trees. We present a novel 0-1 programming model for pruning the tree to minimize some general penalty function based on the resulting leaf nodes, and show that this model possesses a totally unimodular structure that enables it to be solved as a shortest-path problem on an acyclic graph. Moreover, we prove that this problem can be solved in strongly polynomial time while incorporating an additional constraint on the number of residual le…
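The abstract's unconstrained core — choose which internal nodes to collapse so the total leaf penalty is minimized — can be mimicked by a simple bottom-up dynamic program, which is one elementary way to realize the shortest-path view on an acyclic structure. The sketch below uses an illustrative `Node` class and per-node penalty; it is an assumption-laden toy, not the authors' 0-1 formulation or code.

```python
# Sketch: optimal pruning of a binary tree to minimize the sum of leaf
# penalties, via bottom-up dynamic programming over the (acyclic) tree.
# The Node fields and the penalty function are illustrative assumptions.

class Node:
    def __init__(self, penalty, left=None, right=None):
        self.penalty = penalty          # penalty incurred if this node is made a leaf
        self.left, self.right = left, right

def optimal_prune(node):
    """Return (best_cost, keep_split) for the subtree rooted at node.

    keep_split is True when retaining the split below node is cheaper
    than collapsing node into a leaf."""
    if node.left is None and node.right is None:
        return node.penalty, False      # already a leaf: no choice to make
    left_cost, _ = optimal_prune(node.left)
    right_cost, _ = optimal_prune(node.right)
    split_cost = left_cost + right_cost
    if split_cost < node.penalty:
        return split_cost, True         # keep the split
    return node.penalty, False          # collapse into a leaf

# Toy tree: collapsing the root costs 10; its two leaves cost 3 and 4.
tree = Node(10, Node(3), Node(4))
print(optimal_prune(tree))              # (7, True): splitting (3+4) beats collapsing (10)
```

The constrained variant in the paper additionally bounds the number of residual leaves, which this toy ignores; the paper's contribution is showing the constrained 0-1 model still solves exactly in strongly polynomial time.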

Cited by 6 publications (6 citation statements)
References 17 publications
“…Proposition 1 of (Sherali et al, 2009) showed that the following set of constraints completely characterize the set of valid pruned trees of T .…”
Section: Background and Notations (mentioning; confidence: 99%)
“…By showing that the constraint matrix can be turned into a network matrix form, (Sherali et al, 2009) showed the above integer problem can be solved exactly by linear program relaxation.…”
Section: Background and Notations (mentioning; confidence: 99%)
“…This network design problem is often avoided, by, e.g., choosing a binary tree of depth D, for a given value of D. To make this decision more data-dependent, a larger tree is built and pruned afterward, collapsing existing leaf nodes into new ones containing more individuals. See, e.g., Sherali et al (2009) for structural properties of the optimization problem associated with the pruning step. In this way, one obtains a more parsimonious tree, which is expected to perform better for future individuals.…”
Section: Introduction (mentioning; confidence: 99%)
“…As a common framework for avoiding the problem of overfitting noisy data, pruning algorithms have been proposed for many classification algorithms. In decision tree algorithms [2] [3], there are many classical pruning methods such as Reduced Error Pruning (REP) [4], Pessimistic Error Pruning (PEP) [4], Minimum Error Pruning (MEP) [5], and Cost-Complexity Pruning (CCP) [2], and new pruning methods like k-norm pruning [6] and optimal constrained pruning [7] continue to be proposed. For separate-and-conquer rule-learning systems, pruning techniques are also important for noise handling.…”
Section: Introduction (mentioning; confidence: 99%)
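Of the classical methods the last citation lists, Reduced Error Pruning (REP) is the simplest to sketch: working bottom-up, collapse a subtree into a leaf whenever doing so does not increase error on a held-out pruning set. The node fields below (per-node error counts on the pruning set) are illustrative assumptions, not any cited implementation.

```python
# Sketch of Reduced Error Pruning (REP): collapse a subtree into a leaf
# whenever that does not increase the number of errors on a held-out
# pruning set. Node fields are illustrative assumptions.

class TreeNode:
    def __init__(self, errors_as_leaf, left=None, right=None):
        self.errors_as_leaf = errors_as_leaf  # pruning-set errors if made a leaf
        self.left, self.right = left, right

def rep_prune(node):
    """Bottom-up REP; returns pruning-set errors of the (possibly pruned) subtree."""
    if node.left is None and node.right is None:
        return node.errors_as_leaf
    subtree_errors = rep_prune(node.left) + rep_prune(node.right)
    if node.errors_as_leaf <= subtree_errors:  # pruning does not hurt: collapse
        node.left = node.right = None
        return node.errors_as_leaf
    return subtree_errors                      # splitting is strictly better: keep

# Toy: the root's children together make 3 errors, the root as a leaf only 2,
# so REP collapses the split.
root = TreeNode(2, TreeNode(1), TreeNode(2))
print(rep_prune(root), root.left is None)      # 2 True
```

Unlike the paper's model, REP is a greedy heuristic with no global optimality guarantee and no constraint on the number of residual leaves.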