2014
DOI: 10.1016/j.protcy.2014.10.232

Ensemble Feature Ranking Applied to Medical Data

Cited by 21 publications (10 citation statements). References 25 publications.
“…For the Parkinson's dataset, 16 features were selected (feature indices 10, 21, 22, 17, 20, 18, 14, 9, 11, 12, 13, 5, 7, 4, 15, 8). The feature reduction achieved for the CKD, Breast Cancer Wisconsin, and Parkinson's datasets is 24%, 21.8%, and 30.4%, respectively. The feature subset derived by the proposed feature selection algorithm consists of the most discriminatory features, which enhance the accuracy of chronic disease prediction without any loss of the original information. The reduced feature subset is evaluated against three classification algorithms: SVM, CNN, and Gradient Boosting.…”
Section: Experiments and Results (mentioning)
confidence: 99%
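The reduction figures in the statement above are internally consistent: 16 retained features and a 30.4% reduction imply an original count of 23 attributes for the Parkinson's dataset. A minimal arithmetic check in plain Python (the 23 is derived from the quoted numbers, not stated directly):

```python
# Sanity-check the reported feature reduction for the Parkinson's dataset:
# 16 features kept out of an implied 23 original attributes.
original_attributes = 23   # implied by 16 kept features and a 30.4% reduction
selected_features = 16     # listed in the citation statement above

reduction = 1 - selected_features / original_attributes
print(f"Feature reduction: {reduction:.1%}")   # -> 30.4%
```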
“…The performance of the proposed method is compared with CfsSubsetEval, ConsistencySubsetEval, and SF with Entropy, and the results reveal that this method achieves better classification accuracy for the reduced feature subset. Vítor Santos et al. [9] have applied FS techniques to large datasets using Feature Ranking (FR) algorithms. The feature subset is derived using three measures for each class: a statistical between-class distance, an interclass overlapping measure, and an estimate of class impurity.…”
Section: Literature Review (mentioning)
confidence: 99%
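The statement above names three per-class measures but does not define them, so the following is only a rough sketch of distance-based feature ranking. A Fisher-style between-class distance stands in for the paper's actual measures, and scikit-learn's bundled breast cancer data is an assumed placeholder dataset:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer

# Rank features by a Fisher-style between-class distance: squared difference
# of the class means divided by the sum of the class variances. This only
# illustrates the general idea, not the cited paper's exact formulation.
X, y = load_breast_cancer(return_X_y=True)

def fisher_score(x, y):
    classes = np.unique(y)
    means = np.array([x[y == c].mean() for c in classes])
    vars_ = np.array([x[y == c].var() for c in classes])
    return (means[0] - means[1]) ** 2 / (vars_.sum() + 1e-12)

scores = np.array([fisher_score(X[:, j], y) for j in range(X.shape[1])])
ranking = np.argsort(scores)[::-1]          # best feature first
print("Top 5 features by Fisher score:", ranking[:5])
```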
“…All the features retained for the different values of k were used in the next step. To assess the performance of the features selected in Table 2, a random forest classifier [15], [24] was used in our experiment. Table 3 shows the classification performance for different values of k; performance decreased by about 0.4% and 1.3% when k is 5 and 10, respectively.…”
Section: Analysis and Discussion (mentioning)
confidence: 99%
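The evaluation loop described above (keep the top-k ranked features, then score them with a random forest) can be sketched as follows. The dataset, ranking criterion (mutual information), and values of k are placeholders, not those of the cited work:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Evaluate top-k feature subsets with a random forest for several k.
X, y = load_breast_cancer(return_X_y=True)

for k in (5, 10, 15, X.shape[1]):            # X.shape[1] = keep all features
    model = make_pipeline(
        SelectKBest(mutual_info_classif, k=k),
        RandomForestClassifier(n_estimators=200, random_state=0),
    )
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"k={k:>2}: mean CV accuracy = {acc:.3f}")
```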
“…They evaluate their proposed method using two different domain areas: image and protein interaction datasets. Owing to time and space complexity, Santos et al. [15] came up with an ensemble feature ranking strategy to improve the efficiency of classification. They compared the performance of the proposed method with support vector machine (SVM), bagging, random forest (RF), and Naive Bayes (NB) on a breast cancer dataset.…”
Section: Feature Selection Algorithms (mentioning)
confidence: 99%
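The comparison described above (a reduced feature subset evaluated with SVM, bagging, RF, and NB on a breast cancer dataset) can be sketched roughly as below. A simple SelectKBest reducer and scikit-learn's bundled Wisconsin breast cancer data stand in for the paper's ensemble ranker and data:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Compare several classifiers on a reduced feature subset.
X, y = load_breast_cancer(return_X_y=True)
reducer = SelectKBest(mutual_info_classif, k=10)   # placeholder reducer

classifiers = {
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "Bagging": BaggingClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
}

for name, clf in classifiers.items():
    model = make_pipeline(reducer, clf)
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:>13}: mean CV accuracy = {acc:.3f}")
```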
“…The results showed that BLAST with fading factors outperformed BLAST using the window approach. Santos (20) proposed an ensemble feature ranking using Information Gain, Gain Ratio, Symmetrical Uncertainty, Chi-Square, and other criteria, evaluated with several classifiers on a breast cancer dataset; Naïve Bayes gave the best performance, with higher AUC and lower FPR. Khan et al. (21) presented a technique called Optimal Tree Ensemble (OTE), which integrates trees that are both accurate and diverse.…”
Section: Related Work (mentioning)
confidence: 99%
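A minimal sketch of the ensemble-ranking idea attributed to Santos (20): score each feature with several criteria, average the per-criterion ranks, keep the top of the combined ranking, and report Naïve Bayes AUC on the reduced set. Information Gain is approximated here by mutual information, and Gain Ratio and Symmetrical Uncertainty are omitted for brevity; this is an assumption-laden illustration, not the paper's exact procedure:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import chi2, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.preprocessing import MinMaxScaler

# Ensemble feature ranking sketch: average the ranks produced by two
# criteria (chi-square and mutual information), keep the best features,
# and evaluate Naive Bayes AUC on the reduced subset.
X, y = load_breast_cancer(return_X_y=True)
X_pos = MinMaxScaler().fit_transform(X)      # chi2 needs non-negative inputs

criteria = [chi2(X_pos, y)[0], mutual_info_classif(X, y)]
ranks = np.mean([(-np.asarray(s)).argsort().argsort() for s in criteria], axis=0)
top = np.argsort(ranks)[:10]                 # 10 best features by average rank

auc = cross_val_score(GaussianNB(), X[:, top], y, cv=5, scoring="roc_auc").mean()
print(f"Naive Bayes AUC on the top-10 ensemble-ranked features: {auc:.3f}")
```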