2021
DOI: 10.1016/j.chemolab.2021.104396
A hybrid ensemble-filter wrapper feature selection approach for medical data classification

Cited by 71 publications (24 citation statements)
References 71 publications
“…Classifiers like Decision Tree (DT), K-Nearest Neighbor (KNN), and SVM are used for the performance evaluation of the selected optimal genes. A hybrid ensemble-filter wrapper feature selection approach is described in [15] for medical data classification. In this model, the authors adopt subset-based filter approaches such as Correlation-based Feature Selection (CFS) and Consistency (CONS), and rank-based filter approaches such as the chi-square test, information gain, and ReliefF for initial gene selection.…”
Section: Related Work
confidence: 99%
“…It is mainly utilized on higher-dimensional data to estimate the efficacy of attributes in classification. IGFR measures the worth of attributes by estimating the information gain of features with respect to the targeted class (Singh and Singh 2021). Indeed, IGFR calculates the amount of information needed to predict the targeted class given the absence or presence of an attribute.…”
Section: Feature Reduction
confidence: 99%
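The information-gain measure described above can be sketched directly from its entropy definition. This is a minimal illustration, not code from the cited paper; the function names `entropy` and `info_gain` are chosen here for clarity:

```python
# Sketch (not from the cited paper): information gain of a discrete
# feature `a` with respect to class labels `c`, via IG(c, a) = H(c) - H(c|a).
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy H(c) of a label sequence, in bits."""
    n = len(labels)
    return -sum((k / n) * math.log2(k / n) for k in Counter(labels).values())

def info_gain(feature, labels):
    """IG(c, a) = H(c) - H(c|a) for a discrete-valued feature."""
    n = len(labels)
    conditional = 0.0
    for v in set(feature):
        subset = [c for f, c in zip(feature, labels) if f == v]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# Toy check: a feature that perfectly determines the class has IG = H(c).
labels = [0, 0, 1, 1]
feature = ['a', 'a', 'b', 'b']
print(info_gain(feature, labels))  # 1.0 bit
```

A constant-valued feature carries no information about the class, so its information gain is zero; this is exactly what makes IG usable as a filter score.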
“…IG(c, a) = H(c) − H(c|a), where the marginal entropy is denoted H(c) and the conditional entropy of c given a is denoted H(c|a). The IGFR is a fast filter-based FS approach whereby the attributes are graded in descending order of IG score and are carefully chosen according to a threshold. High IG implies better discriminatory power for making decisions (Singh and Singh 2021).…”
Section: Feature Reduction
confidence: 99%
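The threshold-style filter described above can be sketched with scikit-learn, using `mutual_info_classif` as the information-gain estimator (an assumption on my part; the paper's exact scorer may differ), and the breast-cancer dataset and threshold value purely as illustration:

```python
# Sketch: rank features in descending order of an information-gain-style
# score and keep those above a threshold (filter-based feature selection).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)

# Mutual information between each feature and the class label.
scores = mutual_info_classif(X, y, random_state=0)

order = np.argsort(scores)[::-1]      # descending IG ranking
threshold = 0.2                       # illustrative cut-off
selected = [i for i in order if scores[i] > threshold]

print(f"kept {len(selected)} of {X.shape[1]} features")
```

Subset selection then proceeds by training the downstream classifier only on the columns in `selected`.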
“…which is attained by solving the subsequent quadratic programming problem (QPP) provided in Eq. At this point, b signifies the bias, w ∈ ℱ stands for the weight vector of the optimal hyperplane within the transformed feature space ℱ, ξ_i denotes the slack variable representing the error associated with the margin of the i-th instance with respect to the separating hyperplane [19], and C denotes the regularization parameter that controls the misclassification cost of samples and penalizes ξ_i, C > 0. With the introduction of non-negative Lagrange multipliers α_i, Eq.…”
Section: Image Classification
confidence: 99%
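The QPP sketched in the passage matches the standard soft-margin SVM primal; the following is a reconstruction from the symbols defined there (with φ as the map into ℱ and m training instances), not a formula quoted from [19]:

```latex
\min_{w,\,b,\,\xi}\;\; \frac{1}{2}\lVert w\rVert^{2} \;+\; C\sum_{i=1}^{m}\xi_{i}
\quad\text{s.t.}\quad
y_{i}\bigl(w^{\top}\phi(x_{i}) + b\bigr) \;\ge\; 1 - \xi_{i},
\qquad \xi_{i}\ge 0,\;\; i = 1,\dots,m.
```

Introducing the non-negative Lagrange multipliers α_i for the margin constraints yields the dual problem that is actually solved in practice.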