2019 1st International Conference on Control Systems, Mathematical Modelling, Automation and Energy Efficiency (SUMMA)
DOI: 10.1109/summa48161.2019.8947569

Defining the Ranges Boundaries of the Optimal Parameters Values for the Random Forest Classifier


Cited by 11 publications (3 citation statements)
References 8 publications
“…In the modeling stage using the Random Forest algorithm, adjustments were made to several key parameters to enhance model performance (Demidova & Ivkina, 2019; Kurniawati, Novita Nurmala Putri, & Kurnia Ningsih, 2020). The number of decision trees used is 100, and the assigned random state value is 42.…”
Section: Random Forest Algorithm
confidence: 99%
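The two values quoted above map directly onto scikit-learn's RandomForestClassifier; a minimal sketch, assuming that library (the excerpt names only the parameter values, and the synthetic data here is illustrative):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Illustrative synthetic data; the citing paper's dataset is not given here.
    X, y = make_classification(n_samples=500, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # The two settings quoted in the excerpt: 100 trees, random state 42.
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))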
“…When test samples reach the classification stage, the final classification result is determined by a vote over the decision trees (DT). Researchers usually start by raising the accuracy of each base classifier and then reducing the correlation among the classification models [54]. The final combination of the classification results is achieved by the RF method, where the outcomes of each base classifier have a similar error distribution.…”
Section: B. Random Forests Classifier
confidence: 99%
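A hedged sketch of that vote, assuming bagged scikit-learn decision trees as the base classifiers (the tree count and data are illustrative, not taken from [54]):

    from collections import Counter

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=300, n_features=10, random_state=0)
    rng = np.random.default_rng(0)

    # Train each base DT on a bootstrap sample, as a random forest does.
    trees = []
    for _ in range(25):
        idx = rng.integers(0, len(X), size=len(X))
        trees.append(DecisionTreeClassifier(max_features="sqrt").fit(X[idx], y[idx]))

    # The vote: each tree classifies the sample and the majority class wins.
    def forest_predict(x):
        votes = [int(t.predict(x.reshape(1, -1))[0]) for t in trees]
        return Counter(votes).most_common(1)[0][0]

    print(forest_predict(X[0]), "vs true label", y[0])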
“…Examples of classifier methods [14] are Decision Tree (DT), Support Vector Machine (SVM), Artificial Neural Network (ANN), k-Nearest Neighbor (kNN), etc. A classifier needs the correct choice of parameters to produce a high-quality classification [15]. For example, the SVM classifier uses the kernel function parameter to solve nonlinear multi-class classification problems [16].…”
Section: Feature Extractor and Classifier
confidence: 99%
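A short sketch of that kernel-parameter choice, assuming scikit-learn's SVC on a three-class dataset (the RBF kernel and its settings are illustrative; the excerpt from [16] does not fix them):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Three classes; SVC handles the multi-class case via one-vs-one voting.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # The kernel function is the parameter that makes the decision boundary
    # nonlinear; 'rbf' with gamma='scale' is one common, illustrative choice.
    svm = SVC(kernel="rbf", gamma="scale", C=1.0)
    svm.fit(X_train, y_train)
    print("test accuracy:", svm.score(X_test, y_test))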