Comprehensive Chemometrics 2009
DOI: 10.1016/b978-044452701-1.00025-9
Decision Tree Modeling in Classification

Cited by 12 publications (11 citation statements)
References 40 publications
“…Given a set of continuous or categorical independent variables, classification and regression trees (CART) are non-parametric techniques that can explain the response of a dependent variable (Breiman et al., 1984; Moisen, 2008). The initial data are split into subgroups, and the variance of these subgroups is minimized until they reach homogeneity (Brown and Myles, 2009).…”
Section: Decision Trees (Cart)mentioning
confidence: 99%
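The variance-minimization split described in the statement above can be sketched in plain Python. This is an illustrative sketch only, not the chapter's implementation; the function name `best_split` is ours.

```python
def best_split(x, y):
    """Find the threshold on feature x that minimizes the weighted
    variance of the two resulting subgroups (CART-style regression split)."""
    def var(v):
        m = sum(v) / len(v)
        return sum((t - m) ** 2 for t in v) / len(v)

    # sort samples by feature value so candidate thresholds lie between neighbors
    order = sorted(range(len(x)), key=lambda i: x[i])
    xs = [x[i] for i in order]
    ys = [y[i] for i in order]

    best_threshold, best_score = None, float("inf")
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        # weighted average variance of the two subgroups
        score = (len(left) * var(left) + len(right) * var(right)) / len(ys)
        if score < best_score:
            best_threshold = (xs[i - 1] + xs[i]) / 2
            best_score = score
    return best_threshold, best_score
```

Applied recursively to each resulting subgroup, this is the splitting step of a regression tree; homogeneity is reached when the subgroup variance can no longer be reduced.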
“…The model divides the initial database into subgroups with the aim of finding homogeneous groups using variance-minimization algorithms [37]. Depending on the type of tree, the common optimization criteria are: mean-squared error (MSE) for regression trees; Gini's diversity index, the twoing rule, or deviance for classification trees.…”
Section: Classification and Regression Trees (Cart)mentioning
confidence: 99%
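The node-impurity criteria named in this statement can be written out directly (the twoing rule is omitted for brevity). A minimal pure-Python sketch, with function names of our choosing:

```python
import math
from collections import Counter

def mse(y):
    """Mean-squared error around the node mean (regression trees)."""
    m = sum(y) / len(y)
    return sum((t - m) ** 2 for t in y) / len(y)

def gini(labels):
    """Gini's diversity index: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def deviance(labels):
    """Deviance (cross-entropy): -sum of p * log(p) over classes."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())
```

All three reach zero on a perfectly homogeneous node, which is what the splitting algorithm drives toward.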
“…Pruning and cross-validation simplify tree growth and improve the stability and predictive accuracy (Kim & Upneja, 2014). An n-fold (with n = 3–10) cross-validation is a common empirical approach to optimizing tree growth (Brown & Myles, 2009). It is a resampling technique that uses multiple random training and validation subsamples.…”
Section: Classification and Regression Treesmentioning
confidence: 99%
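The n-fold partitioning behind this statement amounts to rotating which block of samples is held out for validation. A small sketch of the index bookkeeping, under our own naming:

```python
def k_fold_indices(n_samples, k):
    """Yield (train, validate) index lists for k-fold cross-validation.

    Each sample appears in the validation set exactly once; fold sizes
    differ by at most one when k does not divide n_samples evenly.
    """
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    start = 0
    for size in fold_sizes:
        validate = list(range(start, start + size))
        train = [i for i in range(n_samples) if i < start or i >= start + size]
        yield train, validate
        start += size
```

Growing the tree on each training subsample and scoring it on the held-out fold gives the averaged error used to choose the tree size.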
“…The CART algorithm uses the Gini index measure as the splitting criterion and performs only binary splits (Bozkir & Sezer, 2011). The Gini index generalizes the variance impurity, i.e., the variance of a distribution associated with two classes (Brown & Myles, 2009). This is a recursive process: it repeats until a predefined measure of homogeneity/purity (or another stopping criterion) is satisfied.…”
Section: Classification and Regression Treesmentioning
confidence: 99%
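The sense in which the Gini index generalizes the two-class variance impurity can be checked numerically: for a node with class proportions p and 1 - p, the Gini index equals 2p(1 - p), twice the variance of the corresponding 0/1 indicator variable. A quick sketch (function names are ours):

```python
def gini_binary(p):
    # Gini impurity of a two-class node with class-1 proportion p
    return 1.0 - (p ** 2 + (1 - p) ** 2)

def bernoulli_variance(p):
    # variance of an indicator variable that is 1 with probability p
    return p * (1 - p)
```

Both vanish when the node is pure (p = 0 or p = 1) and peak at p = 0.5, so minimizing either drives the recursive splitting toward homogeneous subgroups.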