2019 3rd International Conference on Computing Methodologies and Communication (ICCMC)
DOI: 10.1109/iccmc.2019.8819828
Success Prediction using Random Forest, CatBoost, XGBoost and AdaBoost for Kickstarter Campaigns

Cited by 66 publications (31 citation statements)
References 6 publications
“…18,19 Several studies have reported higher performance with AdaBoost than with XGBoost, despite XGBoost's popularity. 20,21,22,23,24,25,26 Compared with previous studies, this study makes the following contributions. First, it is a novel approach to apply and compare an extensive range of data-driven machine learning algorithms for nomograms subject to SMILE.…”
Section: Discussion
confidence: 99%
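The AdaBoost-versus-XGBoost comparison reported above can be reproduced in outline. Below is a minimal sketch, assuming scikit-learn and the xgboost package; the synthetic dataset and all hyperparameters are illustrative stand-ins, not the data or settings used in the cited studies.

```python
# Hedged sketch: cross-validated comparison of AdaBoost and XGBoost.
# Dataset and hyperparameters are assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Synthetic binary classification problem standing in for real study data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=0),
    "XGBoost": XGBClassifier(n_estimators=200, max_depth=3,
                             eval_metric="logloss", random_state=0),
}

# 5-fold cross-validation; which model wins depends on the data,
# consistent with the mixed findings the citing paper summarizes.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```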
“…Each model reduces the errors of its predecessor until the maximum accuracy is achieved or the maximum number of models is reached. It uses the model generated by random forest as the base estimator to generate the best results. 40 A number of excellent models have been derived from the AdaBoost formation idea, including XGBoost, LightGBM, and CatBoost, as shown in Figure 3.…”
Section: Methods
confidence: 99%
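The quoted passage describes AdaBoost using a random forest as its base estimator. A minimal scikit-learn sketch of that setup follows; note that scikit-learn renamed the `base_estimator` argument to `estimator` in version 1.2, and the dataset and hyperparameter choices here are assumptions, not the cited paper's configuration.

```python
# Hedged sketch: AdaBoost boosting over random-forest base estimators,
# as the quoted passage describes. Values are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=15, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each boosting round reweights the samples the previous round got wrong,
# so successive models focus on their predecessor's errors.
base = RandomForestClassifier(n_estimators=50, max_depth=5, random_state=0)
clf = AdaBoostClassifier(estimator=base,   # `base_estimator` before sklearn 1.2
                         n_estimators=10, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```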
“…• Random Forest uses decision trees, which individually are prone to overfitting [13]; typically, it is used to achieve higher accuracy by combining trees built on different sets of attributes and samples. Random forest is known for fast training and prediction, high classification accuracy, and good performance on high-dimensional input data [14,15].…”
Section: Introduction
confidence: 99%
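As an illustration of the Random Forest properties described above (trees built on different subsets of attributes and samples, good behavior on high-dimensional input), here is a minimal scikit-learn sketch; the data shape and settings are assumed for illustration, not drawn from the cited references.

```python
# Hedged sketch: Random Forest on high-dimensional input.
# Data dimensions and parameters are assumptions for illustration.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# High-dimensional input: many features relative to samples.
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=20, random_state=0)

# Each tree sees a bootstrap sample of rows; max_features controls the
# random subset of attributes considered at each split.
rf = RandomForestClassifier(n_estimators=300, max_features="sqrt",
                            n_jobs=-1, random_state=0)
print("CV accuracy:", cross_val_score(rf, X, y, cv=5).mean())
```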