2014 17th International Conference on Computer and Information Technology (ICCIT)
DOI: 10.1109/iccitechn.2014.7073129
A comprehensive method for attribute space extension for Random Forest

Cited by 10 publications (11 citation statements)
References 14 publications
“…As noted in previous works [1][2][3], enriching the feature space contributes significantly to classification performance on numeric data. Studies on extended space forests so far utilize either randomly chosen features [3] or a specific feature selection method such as gain ratio [1-2] to determine new candidate features to be consolidated into the original feature space.…”

Section: Word Embedding (WE)
confidence: 80%
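The feature-space extension this statement refers to can be made concrete with a short sketch. The snippet below illustrates the randomly-chosen-features variant [3]: each new feature combines two randomly selected original features. The function name and the difference operator are illustrative assumptions, not the exact method of the cited papers.

```python
# Hedged sketch of feature-space extension with randomly chosen
# feature pairs, in the spirit of extended space forests. The
# combination operator (difference) is an illustrative assumption.
import numpy as np

def extend_feature_space(X, n_new, seed=None):
    """Append n_new features, each the difference of two randomly
    chosen original features."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    new_cols = []
    for _ in range(n_new):
        i, j = rng.choice(n_features, size=2, replace=False)
        new_cols.append(X[:, i] - X[:, j])
    return np.hstack([X, np.column_stack(new_cols)])

# Example: 100 samples, 5 original features, 3 random combinations
X = np.random.default_rng(0).normal(size=(100, 5))
print(extend_feature_space(X, n_new=3, seed=0).shape)  # (100, 8)
```

Each tree of the forest is then grown over both the original and the combined features.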
“…To achieve higher classification performance of the ensemble system, they suggest utilizing extended space methods. Recent studies [1][2] on extended space decision trees propose to increase ensemble accuracy with a different approach: instead of being produced randomly, new features with high classification capacity are generated by computing the gain ratio of each candidate feature.…”

Section: Related Work
confidence: 99%
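The gain ratio criterion mentioned here is the standard C4.5 split criterion: information gain normalized by the split information of the candidate feature. A minimal sketch follows, assuming discrete candidate feature values (continuous candidates would need discretization first); the function names are hypothetical.

```python
# Hedged sketch: gain ratio of a candidate feature, the criterion
# the quote says is used to keep only new features with high
# classification capacity.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(feature, labels):
    feature, labels = np.asarray(feature), np.asarray(labels)
    n = len(labels)
    info_gain, split_info = entropy(labels), 0.0
    for v in set(feature):
        mask = feature == v
        w = mask.sum() / n
        info_gain -= w * entropy(labels[mask])   # weighted child entropy
        split_info -= w * np.log2(w)             # intrinsic value of split
    return info_gain / split_info if split_info > 0 else 0.0

# Example: a candidate feature that separates the two classes perfectly
f = [0, 0, 1, 1, 1, 0]
y = ['a', 'a', 'b', 'b', 'b', 'a']
print(gain_ratio(f, y))  # 1.0 for this perfect split
```

Candidate features are ranked by this score, and only the highest-scoring ones are consolidated into the extended space.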
“…Some of the most popular methods are naïve Bayes classifiers (NB), decision trees (DT), support vector machines (SVM), artificial neural networks (ANN), and k-nearest neighbor classifiers (k-NN). Among these classifiers, decision trees have been extensively applied in state-of-the-art studies on ensemble learning [1][2][10][11][12][13][14]. Furthermore, using more than one decision tree gives rise to decision forests for ensemble learning.…”

Section: Introduction
confidence: 99%
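As a minimal illustration of the decision forest idea described above (many trees whose votes are aggregated), the following sketch uses scikit-learn's RandomForestClassifier; the dataset choice is arbitrary and not from the cited work.

```python
# Minimal sketch of a decision forest: an ensemble of decision trees
# whose predictions are aggregated by majority vote.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
forest = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(forest, X, y, cv=5)
print(scores.mean())  # mean cross-validated accuracy of the ensemble
```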