1998
DOI: 10.1109/34.709601
The random subspace method for constructing decision forests

Abstract: Much of previous attention on decision trees focuses on the splitting criteria and optimization of tree sizes. The dilemma between overfitting and achieving maximum accuracy is seldom resolved. A method to construct a decision tree based classifier is proposed that maintains highest accuracy on training data and improves on generalization accuracy as it grows in complexity. The classifier consists of multiple trees constructed systematically by pseudorandomly selecting subsets of components of the feature vect…


Cited by 5,260 publications (672 citation statements)
References 22 publications
“…In a random forest there are many decision trees. For a given input, each decision tree classifies it as yes or no (in the case of binary classification) [24] [25]. Once every tree has cast its yes/no classification, the value that holds the majority among them is taken as the output.…”
Section: Prediction Models For Distinguishing Early PD And Healthy
mentioning
confidence: 99%
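The majority-vote step this statement describes can be sketched in a few lines. The following is a hypothetical illustration, not code from the cited paper; it assumes scikit-learn decision trees whose mutual disparity comes from per-split feature subsampling (max_features) and differing seeds:

```python
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy binary-classification data and a small collection of trees.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
trees = [DecisionTreeClassifier(max_features="sqrt", random_state=i).fit(X, y)
         for i in range(25)]

def forest_predict(trees, x):
    """Majority vote: each tree classifies x, the most common label wins."""
    votes = [t.predict([x])[0] for t in trees]
    return Counter(votes).most_common(1)[0][0]

print(forest_predict(trees, X[0]))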
“…To satisfy a necessary assumption, namely disparity among the classifiers, variability must somehow be introduced into the training process. Many methods have been invented for this purpose, such as random subspace ensembles [13], random forests [14], bagging [15], boosting (e.g. AdaBoost [16]), rotation forests [17] and others.…”
mentioning
confidence: 99%
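Of the diversity-injecting methods this statement lists, the random subspace ensemble [13] is the simplest: train each base classifier on all training rows but only a random subset of the feature columns. A hedged sketch using scikit-learn's BaggingClassifier, which supports exactly this configuration (the parameter is named estimator as of scikit-learn 1.2; older versions call it base_estimator):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Random subspace ensemble [13]: every tree sees all training rows but
# only a random half of the feature columns.
rsm = BaggingClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=50,
    max_samples=1.0,    # use all rows ...
    bootstrap=False,    # ... without resampling them
    max_features=0.5,   # random 50% of the features per tree
    random_state=0,
).fit(X, y)
print(rsm.score(X, y))
```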
“…[15] The extension combines Breiman's "bagging" idea with random selection of features, introduced first by Ho [1] and later independently by Amit and Geman [16], in order to construct a collection of decision trees with controlled variance.…”
Section: A Random Forest
mentioning
confidence: 99%
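A rough sketch of the combination this statement describes, namely bootstrap resampling of training rows (bagging [15]) plus random feature selection at each split (Ho [1]; Amit and Geman [16]). The loop below is illustrative rather than the cited implementation; it is effectively what scikit-learn's RandomForestClassifier automates:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
rng = np.random.default_rng(0)

forest = []
for _ in range(50):
    # Bagging [15]: draw a bootstrap sample of the training rows.
    idx = rng.integers(0, len(X), size=len(X))
    # Random feature selection [1][16]: each split considers only a
    # random subset of the features (max_features="sqrt").
    tree = DecisionTreeClassifier(max_features="sqrt",
                                  random_state=int(rng.integers(1 << 31)))
    forest.append(tree.fit(X[idx], y[idx]))
```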
“…As a result of this randomness, the bias of the forest usually increases slightly (with respect to the bias of a single non-random tree) but, due to averaging, its variance also decreases, usually by more than enough to compensate for the increase in bias, hence yielding an overall better model [1] [18].…”
Section: Advances In Engineering Research Volume 119
mentioning
confidence: 99%
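The bias-variance tradeoff this statement invokes can be made precise with a standard identity (found, e.g., in Hastie, Tibshirani, and Friedman's The Elements of Statistical Learning; it is not from the quoted source): for B identically distributed trees T_b(x) with per-tree variance sigma^2 and pairwise correlation rho,

```latex
\operatorname{Var}\!\left(\frac{1}{B}\sum_{b=1}^{B} T_b(x)\right)
  = \rho\,\sigma^2 + \frac{1-\rho}{B}\,\sigma^2
```

As B grows the second term vanishes, so any randomization that lowers the correlation rho between trees reduces the ensemble's variance even when it slightly raises each tree's bias.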