2010
DOI: 10.1007/978-3-642-14400-4_3

Bootstrap Feature Selection for Ensemble Classifiers

Cited by 12 publications (6 citation statements) | References 14 publications
“…And the whole process of experiments was run on Matlab. By using the bootstrap [34], the dataset was divided into 75% training data and 25% testing data at a ratio of 3:1 approximately. Then seven kinds of feature selection methods were chosen to select the different ratios of subsets to perform the experiment (Tables 2, 4, 5, and 6).…”
Section: Methods of Experiments
confidence: 99%
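The bootstrap split described in the statement above can be sketched as follows: draw n indices with replacement for training, and use the unsampled (out-of-bag) indices as the test set. Note that a standard bootstrap leaves roughly 36.8% of samples out-of-bag on average, so the ~75/25 ratio reported by the citing paper presumably reflects their particular variant; the function name and seed handling here are illustrative, not taken from the cited work.

```python
import random

def bootstrap_split(n_samples, seed=None):
    """Draw a bootstrap sample of indices (with replacement) for
    training; the out-of-bag indices form the test set.
    Illustrative sketch, not the cited paper's exact procedure."""
    rng = random.Random(seed)
    train_idx = [rng.randrange(n_samples) for _ in range(n_samples)]
    test_idx = sorted(set(range(n_samples)) - set(train_idx))
    return train_idx, test_idx

train_idx, test_idx = bootstrap_split(1000, seed=42)
# On average ~63.2% of distinct samples land in training
# and ~36.8% are out-of-bag.
```

Because sampling is with replacement, the training list contains duplicates; the out-of-bag set is by construction disjoint from it, which is what makes it usable as an unbiased test set.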
“…For the high-dimensional RSFC, feature reduction is indispensable since redundant or irrelevant information may confound the statistical testing significance (Bunea et al, 2011), worsen the machine learning model performance (Arbabshirani et al, 2017; Duangsoithong et al, 2010) and increase computational complexity. For the regression tasks, correlation analysis between RSFC and the target phenotypic measure (e.g.…”
Section: Introduction
confidence: 99%
“…Ranking subsets of randomly chosen features before combining was reported in [9]. Bootstrap feature selection for ensembles was proposed in [10].…”
Section: Introduction
confidence: 99%
“…Ranking subsets of randomly chosen features before combining was reported in [9]. Bootstrap feature selection for ensembles was proposed in [10]. The main contributions are 1) feature ranking using ensemble MLP weights combined with RFE, 2) an OOB stopping criterion for optimal feature selection, 3) extension to multi-class problems by combining RFE with a weighted ECOC decoding strategy, and 4) incorporation of the OOB estimate into ECOC decoding. The paper is organised as follows. In Section 2, six feature ranking strategies are described.…”
confidence: 99%