2023
DOI: 10.1016/j.ijforecast.2022.02.010

Targeting predictors in random forest regression

Cited by 63 publications (23 citation statements)
References 45 publications
“…We estimate the outcomes of football matches following the approach of Goller et al. … variables remain in the model; the full list of variables, as well as those selected, can be found in Appendix C.2. This model selection procedure is consistent with Borup, Christensen, Mühlbach, and Nielsen (2022), who find that for predictive models with many covariates, variable selection benefits prediction accuracy and usually 10-30%…”
Section: A Details To Algorithm (supporting)
confidence: 82%
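The statement above describes pre-selecting covariates before fitting a predictive model with many candidates. A minimal sketch of that idea follows, assuming a LASSO-based selection rule and synthetic data; neither is the citing paper's exact procedure.

```python
# Sketch: variable selection before the final predictive model.
# The LassoCV screen and synthetic data are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))          # many covariates, few relevant
y = X[:, :10] @ rng.normal(size=10) + rng.normal(scale=0.5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: variable selection via cross-validated LASSO.
lasso = LassoCV(cv=5).fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_ != 0)
print(f"kept {selected.size} of {X.shape[1]} covariates")

# Step 2: fit the predictive model on the selected covariates only.
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X_tr[:, selected], y_tr)
print("test R^2:", rf.score(X_te[:, selected], y_te))
```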
“…Finally, it is worth mentioning the interesting work of Borup et al. (2020). In their paper, the authors show that proper predictor targeting controls the probability of placing splits along strong predictors and improves prediction.…”
Section: Nonlinear Models (mentioning)
confidence: 93%
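Predictor targeting, as referenced above, screens out weak candidates before the forest is grown so that the random draw of split candidates at each node is more likely to contain informative variables. A minimal sketch follows; the univariate F-test screen and the k=20 cutoff are illustrative stand-ins for the paper's targeting rule.

```python
# Sketch of "targeting" predictors before a random forest: keep only the
# strongest candidates from a univariate screen so splits are more likely
# to land on informative variables. Screen and cutoff are assumptions.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 200))           # mostly noise predictors
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(size=400)

targeted_rf = make_pipeline(
    SelectKBest(f_regression, k=20),      # targeting step: screen predictors
    RandomForestRegressor(n_estimators=300, random_state=1),
)
targeted_rf.fit(X, y)
print("in-sample R^2:", targeted_rf.score(X, y))
```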
“…Here we can plainly see how decision trees are constructed, with each step in the chain of events leading to the final conclusion. The process of partitioning the whole training data into subsets at every internal node, depending on some criterion, is required for the construction of a decision tree (Iwendi et al., 2020; Borup et al., 2022).…”
Section: Methods (mentioning)
confidence: 99%
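The partitioning step described above can be made concrete with a small sketch: at an internal node, pick the (feature, threshold) pair that most reduces the target's variance across the two child subsets. The function name and the variance criterion are illustrative assumptions; CART-style regression trees proceed similarly.

```python
# Sketch of one node's partitioning step in decision-tree construction.
import numpy as np

def best_split(X, y):
    """Return (feature, threshold) minimizing weighted child variance."""
    best = (None, None, np.inf)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * left.var() + len(right) * right.var()) / len(y)
            if score < best[2]:
                best = (j, t, score)
    return best[:2]

X = np.array([[1.0], [2.0], [3.0], [10.0], [11.0]])
y = np.array([1.0, 1.1, 0.9, 5.0, 5.2])
print(best_split(X, y))   # splits between the two clusters of y values
```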
“…Although decision trees (DT) are simple to comprehend and perform well on particular data, they exhibit large variance due to the greedy nature of the algorithm, which makes the tree always select the locally optimal split at each level without looking beyond the current level. Thus, overfitting may occur, where the model performs far better on the training set than on the testing set (Borup et al., 2022).…”
Section: Methods (mentioning)
confidence: 99%
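The variance problem noted above is easy to demonstrate: an unpruned, greedily grown tree fits the training set almost perfectly but generalizes worse, while averaging many trees damps the variance. A minimal sketch under assumed synthetic data follows.

```python
# Sketch: train/test gap of a single unpruned tree vs. a random forest.
# The synthetic data is an illustrative assumption.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(300, 5))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)

tree = DecisionTreeRegressor(random_state=2).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=300, random_state=2).fit(X_tr, y_tr)

# The single tree's train score is near 1 while its test score drops;
# the forest narrows that gap by averaging over many decorrelated trees.
print("tree   train/test R^2:", tree.score(X_tr, y_tr), tree.score(X_te, y_te))
print("forest train/test R^2:", forest.score(X_tr, y_tr), forest.score(X_te, y_te))
```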