A Review on Random Forest: An Ensemble Classifier
2018 | DOI: 10.1007/978-3-030-03146-6_86

Cited by 241 publications (114 citation statements)
References: 9 publications
“…In combination with ML approaches, we predicted neurotoxin-induced perturbations in hMOs. RF is by design a useful technique for reducing predictive variability, preventing overfitting and achieving high classification accuracy [12]. Importantly, RF gives estimates of which variables are most important in the classification [10].…”
Section: Discussion (mentioning)
confidence: 99%
“…In combination with a powerful machine learning approach for the analysis of multivariate profiling data, we were able to predict neurotoxin-induced perturbations in the human midbrain organoid system. Random forest by design is a well-established technique for reducing predictive variability, preventing overfitting and achieving high classification accuracy (Parmar et al, 2019). Importantly, random forest gives estimates of which variables are most important in the classification (Breiman, 2001).…”
Section: Discussion (mentioning)
confidence: 99%
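Both discussion statements above note that random forest reports which variables matter most for the classification. As a rough illustration of that use, here is a minimal sketch assuming scikit-learn's RandomForestClassifier; the synthetic dataset and parameter values are placeholders, not taken from the cited studies.

```python
# Minimal sketch: ranking variables by random-forest importance
# (assumption: scikit-learn; data and parameters are illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for multivariate profiling data.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)

# Fit a random forest and read out per-feature importance scores.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X, y)

# Rank variables by their estimated contribution to the classification.
ranked = sorted(enumerate(rf.feature_importances_),
                key=lambda t: t[1], reverse=True)
for idx, score in ranked[:5]:
    print(f"feature {idx}: importance {score:.3f}")
```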
“…A random forest algorithm was chosen for classification because of its higher performance with imbalanced data when compared to other machine learning classifiers [31][32][33][34]. A random forest algorithm is an ensemble classifier that combines a specified number of decision trees and takes the majority decision to predict classification, thus preventing overfitting.…”
Section: Classification (mentioning)
confidence: 99%
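The statement above describes random forest as an ensemble of a specified number of decision trees whose majority decision gives the prediction. The sketch below makes that mechanism explicit, assuming scikit-learn decision trees and NumPy; the imbalanced toy dataset and tree count are illustrative, not drawn from the cited work.

```python
# Minimal sketch of bootstrap sampling plus majority vote
# (assumption: scikit-learn + NumPy; data and settings are illustrative).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Imbalanced toy data (about 90% / 10% class split).
X, y = make_classification(n_samples=400, n_features=10,
                           weights=[0.9, 0.1], random_state=1)

rng = np.random.default_rng(1)
n_trees = 25
trees = []
for _ in range(n_trees):
    # Each tree is trained on a bootstrap sample, which decorrelates the ensemble.
    idx = rng.integers(0, len(X), size=len(X))
    trees.append(DecisionTreeClassifier(max_features="sqrt").fit(X[idx], y[idx]))

# Majority decision across the specified number of trees.
votes = np.stack([t.predict(X) for t in trees])
majority = (votes.mean(axis=0) >= 0.5).astype(int)
print("training accuracy of the majority vote:", (majority == y).mean())
```

In practice scikit-learn's RandomForestClassifier wraps this bagging-and-voting logic (plus per-split feature subsampling), so the manual loop is shown only to mirror the description in the citation statement.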