Proceedings of the 2016 International Conference on Artificial Intelligence: Technologies and Applications
DOI: 10.2991/icaita-16.2016.48
Random Forest Regression Based on Partial Least Squares Connect Partial Least Squares and Random Forest

Cited by 4 publications (4 citation statements) · References 4 publications
“…The relative variable importance was determined by Partial Least Squares (PLS) based on the weighted sums of the absolute regression coefficients, using the importance function in the "Random Forest" packages. PLS is an analytical method applicable when the data include more predictors than observations, and it is used to estimate the importance of a predictor variable in a regression model [55][56][57]. In RF, the most commonly used method to analyze variable importance is the mean decrease in Gini, where the importance value is expressed as a percentage; this corresponds to the variable's mean total decrease in node impurity, weighted by the proportion of samples reaching that node in each decision tree of the RF, i.e., a normalization of the importance values [49].…”
Section: Discussion
confidence: 99%
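As a concrete illustration of the two importance measures described in this excerpt, the following sketch computes a PLS importance from the absolute regression coefficients and a random-forest importance from the mean decrease in impurity. It uses synthetic data and scikit-learn rather than the R packages the citing authors mention, so it is an assumption-laden outline, not their code.

```python
# Minimal sketch (synthetic data, not the cited authors' code) of two
# variable-importance measures: PLS absolute regression coefficients and
# random-forest mean decrease in impurity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                     # 60 observations, 8 predictors
y = 2.0 * X[:, 0] - X[:, 3] + rng.normal(scale=0.5, size=60)

# PLS importance: absolute regression coefficients, normalised to percentages.
pls = PLSRegression(n_components=3).fit(X, y)
pls_importance = np.abs(pls.coef_).ravel()
pls_importance = 100.0 * pls_importance / pls_importance.sum()

# RF importance: mean decrease in impurity, normalised by scikit-learn to sum to 1.
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
rf_importance = 100.0 * rf.feature_importances_

for j in range(X.shape[1]):
    print(f"x{j}: PLS {pls_importance[j]:5.1f}%   RF {rf_importance[j]:5.1f}%")
```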
“…As a nonparametric multivariate statistical analysis method, partial least squares (PLS) provides a regression modeling method that relates multiple dependent variables to multiple independent variables, effectively addressing the problem of multicollinearity [14, 15] that arises when the independent variables are highly correlated. Building on these strengths, some researchers have proposed a series of improved models.…”
Section: Related Work
confidence: 99%
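The multicollinearity point can be illustrated with a small synthetic example: when predictors are nearly collinear, ordinary least-squares coefficients become unstable, whereas PLS regresses the response on a few latent components. The sketch below is illustrative only and is not taken from the cited works.

```python
# Minimal sketch of PLS under multicollinearity: three nearly collinear
# predictors derived from one latent signal. OLS coefficients are erratic,
# while the single-component PLS coefficients stay small and similar.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 50
z = rng.normal(size=n)
X = np.column_stack([z + rng.normal(scale=0.01, size=n) for _ in range(3)])
y = z + rng.normal(scale=0.1, size=n)

ols = LinearRegression().fit(X, y)
pls = PLSRegression(n_components=1).fit(X, y)

print("OLS coefficients:", ols.coef_)           # typically unstable under collinearity
print("PLS coefficients:", pls.coef_.ravel())   # stable, near-equal weights
```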
“…The idea has been tried in several papers. Reddy et al. [26] and Yeh et al. [27] sequentially fit a PLS regression and a regression tree, whereas Hao et al. [28] combined PLS regression with regression trees and, additionally, PLS regression with random forests. In the latter, all predictors were used for splitting and fitting, and only univariate response variables were considered in these studies.…”
Section: Introduction
confidence: 99%
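The combination these papers describe can be sketched as a two-stage pipeline: extract PLS component scores from the predictors, then fit a random forest on those scores. The snippet below is an illustrative outline under that assumption, not a reproduction of any of the cited algorithms.

```python
# Minimal sketch of a PLS-then-random-forest pipeline on synthetic data:
# step 1 projects the predictors onto a few supervised PLS components,
# step 2 fits a random forest regressor on the component scores.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 20))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: PLS extracts a small number of latent components from the predictors.
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
T_tr = pls.transform(X_tr)   # component scores for the training predictors
T_te = pls.transform(X_te)

# Step 2: the random forest is fitted on the PLS scores instead of the raw X.
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(T_tr, y_tr)
print("R^2 on held-out data:", rf.score(T_te, y_te))
```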