Proceedings of the Third International Conference on Computing, Mathematics and Statistics (iCMS2017) 2019
DOI: 10.1007/978-981-13-7279-7_33
BayesRandomForest: An R Implementation of Bayesian Random Forest for Regression Analysis of High-Dimensional Data

Cited by 8 publications (10 citation statements). References 8 publications.
“…The random forest (RF) and Naïve Bayes (NB) algorithms were selected based on our references [31,32]. The third algorithm was the Ensemble (of RF and NB), which was selected using majority voting [33,34]. The R code is as shown in Figure 9 below.…”
Section: Methods
confidence: 99%
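The excerpt's Figure 9 (R code) is not reproduced here, but an RF-plus-Naive-Bayes majority-voting ensemble of the kind described can be sketched in Python with scikit-learn. The dataset and hyperparameters below are illustrative, not taken from the citing paper.

```python
# Hypothetical Python analogue of the RF + Naive Bayes ensemble with
# majority voting described in the citing paper (its Figure 9 is R code).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in data (the real study uses its own dataset)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
nb = GaussianNB()
# voting="hard" takes the majority vote over the base classifiers
ensemble = VotingClassifier(estimators=[("rf", rf), ("nb", nb)],
                            voting="hard")
ensemble.fit(X_tr, y_tr)
acc = ensemble.score(X_te, y_te)
```

With hard voting over only two estimators, ties are broken by class order, so in practice a third base learner (or soft voting) is often added.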
“…The topics in the classes include sports, politics, religion, etc., which are diverse enough. Precision (P) was used as a class-specific index, while Recall (R) (also known as sensitivity) is the proportion of the total number of relevant cases that were actually retrieved [23][24][25][26][27][28][29]. The F1 score is a measure of the accuracy on the test dataset, and it is defined as:…”
Section: Performance Evaluation Using 20-Newsgroups Dataset
confidence: 99%
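The quoted definition is truncated at the F1 formula; the standard definition (which this metric names) is the harmonic mean of precision and recall, F1 = 2PR/(P+R). A minimal sketch computing all three from raw confusion counts:

```python
# Class-specific precision, recall, and F1 from raw confusion counts.
# F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R).
def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)   # of retrieved cases, fraction relevant
    recall = tp / (tp + fn)      # of relevant cases, fraction retrieved
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative counts (not from the paper)
p, r, f1 = precision_recall_f1(tp=8, fp=2, fn=2)
```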
“…In this case, the proposed weighted Gini index reduces to the unweighted Gini index, so that the variable is dropped at the splitting stage. The idea behind this is to control the mixture behaviour of the hypergeometric distribution [30]. The dominant category determines the estimates of the category probabilities.…”
Section: Empirical Bayesian Random Forest (EBRF)
confidence: 99%
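The exact weighting scheme of EBRF is not reproduced in the excerpt, but the stated property, that the weighted index collapses to the ordinary Gini index in the degenerate case, can be illustrated with a hypothetical weighting: reweight the class probabilities, renormalize, and compute Gini impurity. Uniform weights recover the unweighted index exactly.

```python
# Sketch of a weighted Gini index (weights here are illustrative only;
# the EBRF weighting scheme is not given in the excerpt).
def gini(p):
    # Ordinary (unweighted) Gini impurity of a probability vector
    return 1.0 - sum(pi ** 2 for pi in p)

def weighted_gini(p, w):
    # Reweight class probabilities by w, renormalize, then apply Gini
    total = sum(wi * pi for wi, pi in zip(w, p))
    q = [wi * pi / total for wi, pi in zip(w, p)]
    return 1.0 - sum(qi ** 2 for qi in q)

probs = [0.7, 0.2, 0.1]
g_plain = gini(probs)
g_uniform = weighted_gini(probs, [1.0, 1.0, 1.0])   # collapses to plain
g_skewed = weighted_gini(probs, [3.0, 1.0, 1.0])    # differs from plain
```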
“…The above derivation implies that, irrespective of , RF can only guarantee that 63% of the relevant subset is present in the entire predictor space. Thus, for the [30] dataset, the empirical estimate of the RF subset size guarantees selection of about 62.8% of the relevant variables. Similar percentile values were observed for the other datasets.…”
Section: Performance Comparison
confidence: 99%
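The ~63% figure matches the classical coverage probability of sampling with replacement, 1 - (1 - 1/p)^p, which approaches 1 - 1/e ≈ 0.632 as p grows; assuming this is the derivation the excerpt refers to, a quick numeric check:

```python
import math

# Probability that a given item appears at least once when drawing
# p times with replacement from p items: 1 - (1 - 1/p)^p.
# As p -> infinity this tends to 1 - 1/e ~= 0.632, i.e. the ~63%
# guarantee quoted in the excerpt (assumed interpretation).
def coverage(p):
    return 1.0 - (1.0 - 1.0 / p) ** p

limit = 1.0 - 1.0 / math.e        # ~= 0.6321
small = coverage(10)              # already close to the limit
large = coverage(10_000)          # essentially at the limit
```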