2022
DOI: 10.12688/f1000research.124604.1

The simplicity of XGBoost algorithm versus the complexity of Random Forest, Support Vector Machine, and Neural Networks algorithms in urban forest classification

Abstract: Background: The availability of urban forest is under serious threat, especially in developing countries where urbanization is taking place rapidly. Meanwhile, there are many classifier algorithms available to monitor the extent of the urban forest. However, we need to assess the performance of each classifier to understand its complexity and accuracy. Methods: This study proposes a novel procedure using R language with RStudio software to assess four different classifiers based on different numbers of trainin…

Cited by 12 publications (3 citation statements)
References 32 publications
“…Furthermore, the comprehensive packages and tools provided by RStudio create an appropriate environment for data analysis and machine learning. The XGBoost algorithm is widely recognized for its superior performance on large datasets and its ability to identify complex relationships and interactions, particularly through its gradient boosting method [28][29][30]. In comparison, other commonly used machine learning algorithms, such as Random Forest and Support Vector Machine (SVM), may have slower training times and lower performance when compared to XGBoost.…”
Section: Machine Learning Algorithm
confidence: 99%
“…In comparison, other commonly used machine learning algorithms, such as Random Forest and Support Vector Machine (SVM), may have slower training times and lower performance when compared to XGBoost. Therefore, based on the data and analysis requirements, it was determined that the combination of RStudio and XGBoost was the most suitable option [28][29][30].…”
Section: Machine Learning Algorithm
confidence: 99%
“…Abdi [52] found similar OA results across four dates over one site when comparing SVM (0.758), extreme gradient boosting (0.751), RF (0.739), and deep learning (0.733). Ramdani and Furqon [53] found better RMSE values when using XGBoost than Artificial Neural Network (ANN), RF, and SVM (respectively 1.56, 4.33, 6.81, and 7.45) when classifying urban forest at a single site. Georganos et al [54] found that Xgboost, when parameterized with a Bayesian procedure, systematically outperformed RF and SVM, mainly with larger sizes of training data, over three sites.…”
Section: Introduction
confidence: 99%