1997
DOI: 10.1023/a:1007413323501

Decision Tree Induction Based on Efficient Tree Restructuring

Abstract: The ability to restructure a decision tree efficiently enables a variety of approaches to decision tree induction that would otherwise be prohibitively expensive. Two such approaches are described here, one being incremental tree induction (ITI), and the other being non-incremental tree induction using a measure of tree quality instead of test quality (DMTI). These approaches and several variants offer new computational and classifier characteristics that lend themselves to particular applications.
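The incremental approach the abstract describes can be illustrated with a toy sketch. This is not the paper's ITI algorithm: where ITI restructures efficiently via recursive tree transpositions, this simplification just rebuilds a subtree whenever the best test at a node changes, so the resulting tree is the same but the cost is higher. All names here are illustrative.

```python
# Toy incremental tree induction: examples arrive one at a time, and a
# subtree is rebuilt whenever its best test changes. (ITI achieves the
# same effect efficiently via tree transpositions; this brute-force
# rebuild is only a sketch of the behaviour.)
from collections import Counter

def gini(labels):
    """Gini impurity of a collection of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_test(examples):
    """Best (feature, value) equality test by weighted Gini, or None."""
    if gini([y for _, y in examples]) == 0.0:
        return None                       # pure node: stay a leaf
    best, best_score = None, float("inf")
    for f in range(len(examples[0][0])):
        for v in {x[f] for x, _ in examples}:
            yes = [y for x, y in examples if x[f] == v]
            no = [y for x, y in examples if x[f] != v]
            if not yes or not no:
                continue                  # test does not split the examples
            score = (len(yes) * gini(yes) + len(no) * gini(no)) / len(examples)
            if score < best_score:
                best, best_score = (f, v), score
    return best

class Node:
    def __init__(self):
        self.examples = []                # training examples seen here
        self.test = None                  # (feature, value), or None for a leaf
        self.yes = self.no = None

    def _child(self, x):
        f, v = self.test
        return self.yes if x[f] == v else self.no

    def add(self, x, y):
        """Incorporate one example; rebuild subtree if the best test changed."""
        self.examples.append((x, y))
        new_test = best_test(self.examples)
        if new_test != self.test:
            self.test = new_test          # "restructure" by rebuilding
            self.yes, self.no = (Node(), Node()) if new_test is not None else (None, None)
            if new_test is not None:
                for ex, label in self.examples:
                    self._child(ex).add(ex, label)
        elif self.test is not None:
            self._child(x).add(x, y)

    def predict(self, x):
        if self.test is None:             # leaf: majority class seen here
            return Counter(y for _, y in self.examples).most_common(1)[0][0]
        return self._child(x).predict(x)
```

Because the tree is revised after every example, the classifier is usable at any point in the stream, which is the computational characteristic that motivates ITI.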

Cited by 265 publications (9 citation statements)
References 20 publications
“…The regression tree-based approaches, belonging to supervised learning, have been widely applied to SM simulation and estimation since their inception (Han et al, 2018; Im et al, 2016; Wei et al, 2019). Basically, a regression tree has a tree structure in which each internal node represents a judgment on an attribute, each branch represents an output of the fitting model, and each leaf node represents a prediction result (Utgoff et al, 1997). Among the four methods, CART achieves the highest computational efficiency.…”
Section: Methods
Mentioning confidence: 99%
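The structure described in the excerpt above — internal nodes that test an attribute, branches for each outcome, and leaves holding a fitted prediction — can be sketched minimally. This is an illustration, not code from the cited work; the class and field names are assumptions.

```python
# Minimal regression tree node: internal nodes test one attribute against a
# threshold, leaves hold a fitted value (e.g. the mean target of the
# training examples that reached them).

class RegressionTreeNode:
    def __init__(self, feature=None, threshold=None,
                 left=None, right=None, prediction=None):
        self.feature = feature        # index of the attribute to test
        self.threshold = threshold    # split point for the test
        self.left = left              # branch when x[feature] <= threshold
        self.right = right            # branch otherwise
        self.prediction = prediction  # leaf value; None for internal nodes

    def predict(self, x):
        if self.prediction is not None:   # leaf: return the fitted value
            return self.prediction
        branch = self.left if x[self.feature] <= self.threshold else self.right
        return branch.predict(x)

# A hand-built two-leaf tree: x[0] <= 5 predicts 1.0, otherwise 4.0.
tree = RegressionTreeNode(feature=0, threshold=5,
                          left=RegressionTreeNode(prediction=1.0),
                          right=RegressionTreeNode(prediction=4.0))
```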
“…Investigating the classification ML model accuracy, e.g., with logistic regression (LR), based on recursively removed features and those that remain, recursive feature elimination (RFE) has been used to reliably identify which features contribute the most to predicting the target property. This was corroborated using an alternative method, the random forest (RF) algorithm, which consists of a “forest” of decision trees (DTs), each node of which splits the dataset based on a condition on a single feature so that similar properties are grouped together. Gini impurity was then used as a measure of how “similar” samples are; this was calculated for each feature and tree before averaging across the “forest”.…”
Section: Electronic Structure and Data Analytic Methods
Mentioning confidence: 99%
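The Gini impurity used in the excerpt above is, for a node with class proportions p_i, 1 − Σ p_i²: it is 0 for a pure node and grows as classes mix. A self-contained sketch (not the cited authors' code) of how a decision tree scores a candidate split with it:

```python
# Gini impurity of a set of class labels, and the size-weighted impurity
# of a candidate binary split, as a decision tree would score it.
from collections import Counter

def gini_impurity(labels):
    """1 - sum of squared class proportions; 0 means the node is pure."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def split_impurity(left, right):
    """Weighted Gini impurity of a split; lower is a better split."""
    n = len(left) + len(right)
    return (len(left) * gini_impurity(left)
            + len(right) * gini_impurity(right)) / n
```

A split that perfectly separates the classes scores 0, so the tree prefers it over any split that leaves the classes mixed.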
“…Many classifiers have been proposed to solve it. The decision tree [26] is one of the most widely used machine learning techniques. It represents a tree-like decision procedure for determining the class of a given instance.…”
Section: Fundamental Knowledge on Bagged Tree
Mentioning confidence: 99%
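The bagged trees named in the section title above combine many decision trees, each trained on a bootstrap resample of the data, and classify by majority vote. A hedged sketch under stated assumptions — a degenerate one-node "stump" stands in for a full decision tree learner, so only the bagging mechanics are shown:

```python
# Bagging sketch: each base learner is fit on a bootstrap resample
# (sampling with replacement), and predictions are combined by majority
# vote. The base learner here is a trivial majority-class "stump".
import random
from collections import Counter

def fit_stump(data):
    """Degenerate decision tree: always predicts the majority class."""
    majority = Counter(y for _, y in data).most_common(1)[0][0]
    return lambda x: majority

def bagged_classifier(data, n_trees=25, fit=fit_stump, seed=0):
    rng = random.Random(seed)
    # One bootstrap resample (same size as data, drawn with replacement)
    # per tree, each fitted independently.
    trees = [fit([rng.choice(data) for _ in data]) for _ in range(n_trees)]

    def predict(x):
        votes = Counter(tree(x) for tree in trees)
        return votes.most_common(1)[0][0]   # majority vote across trees

    return predict
```

Swapping `fit_stump` for a real decision tree learner turns this into ordinary bagged trees; the resampling and voting code is unchanged, which is the point of the abstraction.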