2021 IEEE 11th Annual Computing and Communication Workshop and Conference (CCWC)
DOI: 10.1109/ccwc51732.2021.9376077
Outlier Prediction Using Random Forest Classifier

Cited by 19 publications (10 citation statements); references 6 publications.
“…In order to build tree-based models such as RF, it is necessary to take samples from the dataset, select a subset of attributes, and identify the value that best splits the dataset [16]. Different decision trees, such as the ones in figure 6, are combined to form an RF classifier. The foundation of RF is the wisdom of crowds, a straightforward yet potent theory which states that when many unrelated individuals participate as a committee to make a forecast, the outcome is more likely to be accurate than if only one person made it.…”
Section: Methods (mentioning)
confidence: 99%
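The ensemble idea described in the statement above — bootstrap samples, a random subset of attributes per split, and a committee of trees voting — can be sketched with scikit-learn's `RandomForestClassifier`. This is a minimal illustration; the synthetic dataset and parameter values are assumptions, not the data or settings used in the cited paper:

```python
# Minimal sketch: a random forest as a committee of decision trees,
# each fit on a bootstrap sample and considering a random subset of
# attributes at every split.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic data (illustrative only, not from the cited work)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# bootstrap=True draws samples from the dataset for each tree;
# max_features="sqrt" selects fewer attributes at each split.
rf = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                            bootstrap=True, random_state=0)
rf.fit(X_train, y_train)
acc = rf.score(X_test, y_test)
```

Because the trees are trained on different bootstrap samples and feature subsets, their errors are largely uncorrelated, which is what makes the majority vote stronger than any single tree.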
“…Random forests (RF) are an ensemble learning method for classification, regression, and other problems that works by building a large number of decision trees (DT) during training [23]. Regression and classification are only two of the many problems that can be solved using this potent ML method.…”
Section: K-neighbors Classifier (mentioning)
confidence: 99%
“…Supervised outlier detection methods require the labeling of anomalous training data. Examples are given by [23, 37], who train an RF classifier on labeled outliers. In contrast, unsupervised methods do not require the labeling of anomalies.…”
Section: Related Work (mentioning)
confidence: 99%
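The supervised setup mentioned above — fitting a classifier on data where anomalies carry explicit labels — can be sketched as follows. The two-dimensional data, label scheme, and parameters here are illustrative assumptions, not the setup of references [23] or [37]:

```python
# Sketch of supervised outlier detection: an RF trained on data where
# anomalies are labeled (1 = outlier, 0 = normal). All data here is
# synthetic and for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(300, 2))    # inliers near the origin
outliers = rng.uniform(5.0, 8.0, size=(15, 2))  # labeled anomalies, far away
X = np.vstack([normal, outliers])
y = np.array([0] * 300 + [1] * 15)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict([[0.1, -0.2], [6.5, 7.0]])   # one normal-like, one outlier-like point
```

The limitation the citing paper points at is visible in the sketch: the classifier can only flag outliers resembling those labeled at training time, whereas unsupervised methods need no such labels.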