2020
DOI: 10.1007/s13369-020-04380-2
An Efficient Filter-Based Feature Selection Model to Identify Significant Features from High-Dimensional Microarray Data

Cited by 21 publications (23 citation statements)
References 22 publications
“…(BMR) Raj, D. D., & Mohanasundaram, R. (2020) [11]. Shukla, A. K. (2020) [12] developed a new hybrid technique for gene selection known as the ensemble multipopulation adaptive genetic algorithm (EMPAGA), which can ignore irrelevant genes while accurately classifying cancer.…”
Section: Methods (Author, Ref No)
Mentioning confidence: 99%
“…N is a normalization function that keeps the W(g) value in the interval [−1, 1]. An improvement has been suggested by Raj and Mohanasundaram, who proposed a new feature weighting scheme to overcome the common drawbacks of the Relief family 20 …”
Section: Materials and the Methods
Mentioning confidence: 99%
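To make the quoted weighting step concrete, the sketch below shows the classic Relief update that the statement refers to: each feature weight W(g) is adjusted by nearest-hit/nearest-miss differences and then normalized so it stays in [−1, 1]. This is a minimal, assumed illustration, not the BMR scheme proposed in the cited paper; the function name relief_weights and all parameters are illustrative.

```python
# Minimal sketch of a classic Relief-style feature weighting pass.
# It illustrates the W(g) update and the normalization to [-1, 1]
# mentioned in the quoted statement; it is NOT the authors' BMR scheme.
import numpy as np

def relief_weights(X, y, n_iters=100, rng=None):
    """Estimate per-feature weights W(g) with the basic Relief update."""
    rng = np.random.default_rng(rng)
    n_samples, n_features = X.shape
    # Scale features to [0, 1] so per-feature differences are comparable.
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0
    Xs = (X - X.min(axis=0)) / span

    W = np.zeros(n_features)
    for _ in range(n_iters):
        i = rng.integers(n_samples)
        xi, yi = Xs[i], y[i]
        dists = np.abs(Xs - xi).sum(axis=1)
        dists[i] = np.inf                                 # exclude the sample itself
        same = (y == yi)
        hit = np.argmin(np.where(same, dists, np.inf))    # nearest same-class sample
        miss = np.argmin(np.where(~same, dists, np.inf))  # nearest other-class sample
        # Features that separate the classes gain weight, the others lose it.
        W += np.abs(Xs[miss] - xi) - np.abs(Xs[hit] - xi)

    # Normalization N(.): dividing by the number of iterations keeps each
    # weight in [-1, 1], since the per-feature differences lie in [0, 1].
    return W / n_iters
```

Dividing the accumulated differences by the number of iterations is what bounds each weight to [−1, 1], because the per-feature differences are themselves scaled to [0, 1].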
“…The classification results show that 12 out of the 23 original features were selected as the most significant, with 86.78% accuracy. Raj and Mohanasundaram [44] suggest an improved feature weighting algorithm named Boundary Margin Relief (BMR), based on IG, RF, Multi-SURF, and I-Relief, to identify significant features from high-dimensional, highly redundant microarray datasets. This method selects 345 significant features out of the 7128 dataset features and improves SVM accuracy to 92.30%.…”
Section: Literature Study
Mentioning confidence: 99%
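The evaluation quoted above (345 selected features, SVM accuracy of 92.30%) follows a generic filter-then-classify protocol. The sketch below is a stand-in under stated assumptions: it uses scikit-learn's mutual_info_classif as an information-gain-style filter rather than the BMR algorithm itself, and k=345 merely mirrors the figure quoted above; compare_svm_accuracy is an illustrative name.

```python
# Hedged sketch of the evaluation protocol described above: keep only the
# top-k features ranked by an information-gain-style filter and compare
# cross-validated SVM accuracy against the full feature set.
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def compare_svm_accuracy(X, y, k=345, cv=5):
    """Return (accuracy on all features, accuracy on the top-k filtered features)."""
    svm = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    acc_full = cross_val_score(svm, X, y, cv=cv, scoring="accuracy").mean()

    filtered = make_pipeline(
        SelectKBest(mutual_info_classif, k=k),   # filter-based selection step
        StandardScaler(),
        SVC(kernel="linear"),
    )
    acc_topk = cross_val_score(filtered, X, y, cv=cv, scoring="accuracy").mean()
    return acc_full, acc_topk
```

Calling acc_full, acc_topk = compare_svm_accuracy(X, y) then yields the two accuracies being contrasted in the quoted statement.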
“…The improvement in each classifier's accuracy using ensembles of RF, IG, GR, and CS in Lee et al. [29], Santos et al. [49], and Dongare et al. [16] indicates that ensemble filters can select a significant number of features while accounting for dependencies between features. Furthermore, the experiment conducted by Raj and Mohanasundaram [44] clearly shows that accuracy improved significantly to 92.30% compared with the accuracy obtained on the full set of dataset features. This shows that multi-filter feature selection can improve classification accuracy while producing an optimal number of selected features.…”
Section: Literature Study
Mentioning confidence: 99%
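To illustrate what an ensemble multi-filter ranking looks like in practice, the sketch below averages the per-feature ranks produced by several filter scores. The specific filters used here (ANOVA F, chi-squared, mutual information) are scikit-learn stand-ins for the RF/IG/GR/CS combination mentioned in the cited studies, and ensemble_filter_ranking is an illustrative name, not taken from any of those papers.

```python
# Illustrative sketch of an ensemble multi-filter ranking: score every
# feature with several filters, convert each score vector to ranks, and
# average the ranks across filters.
import numpy as np
from scipy.stats import rankdata
from sklearn.feature_selection import chi2, f_classif, mutual_info_classif

def ensemble_filter_ranking(X, y, top_k=50):
    """Return indices of the top_k features by mean rank across several filters."""
    X_nonneg = X - X.min(axis=0)          # chi2 requires non-negative input
    scores = [
        f_classif(X, y)[0],               # ANOVA F statistic
        chi2(X_nonneg, y)[0],             # chi-squared statistic
        mutual_info_classif(X, y),        # mutual information (IG-like)
    ]
    # Higher score -> better (lower) rank; average the ranks over all filters.
    ranks = np.mean([rankdata(-s) for s in scores], axis=0)
    return np.argsort(ranks)[:top_k]      # indices of the top_k features
```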