2017 Innovations in Power and Advanced Computing Technologies (i-PACT)
DOI: 10.1109/ipact.2017.8244875

Hybrid information gain based fuzzy roughset feature selection in cancer microarray data

Cited by 10 publications (5 citation statements) | References 23 publications

“…Typically, a higher gain value corresponds to a lower entropy. IG has been used extensively in several applications for the diagnosis of cancerous diseases, such as in filtering informative genes for precise cancer classification [201], selecting breast cancer treatment factors based on the entropy formula [202], analyzing and classifying medical data of breast cancer [203], reducing the dimensionality of genes in multiclass cancer microarray gene expression datasets [204], and filtering irrelevant and redundant genes of cancer [201]. In [205], IG is utilized as a feature selection technique to eliminate redundant and irrelevant symptoms in datasets related to diabetes, breast cancer, and heart disease.…”
Section: Feature Selection Methods (FS) | mentioning
confidence: 99%
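
The excerpt above uses information gain (IG) as a filter that ranks genes and discards uninformative ones before classification. A minimal sketch of that filtering idea, assuming a NumPy/scikit-learn setup with a synthetic expression matrix X (samples × genes) and class labels y; the top_k cut-off is illustrative and not a value taken from any of the cited papers:

import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Synthetic stand-in for a microarray: rows are samples, columns are genes.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))      # 60 samples, 500 gene-expression values
y = rng.integers(0, 2, size=60)     # binary class labels (e.g. tumour vs. normal)

# For a discrete class, information gain equals the mutual information I(gene; class);
# mutual_info_classif estimates it for continuous expression values.
ig_scores = mutual_info_classif(X, y, random_state=0)

# Keep the top_k highest-scoring genes (illustrative cut-off) and drop the rest.
top_k = 50
selected = np.argsort(ig_scores)[::-1][:top_k]
X_reduced = X[:, selected]
print(X_reduced.shape)              # (60, 50)

A higher IG score corresponds to a lower conditional entropy of the class given the gene, which is why the excerpt equates a high gain value with low entropy.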
“…$H(Y) = -\sum_{y \in Y} p(y)\log_2 p(y)$, (2) where x ∈ X, y ∈ Y, and p(x), p(y) represent the marginal probability of x and y, respectively. The conditional entropy and information gain [6][7][8] of X versus Y can be calculated in terms of the first two following formulas, respectively. The information gain ratio of X versus Y is the ratio of the information gain to the information entropy, which is formulated in the last following formula.…”
Section: Information Gain Ratio | mentioning
confidence: 99%
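
The conditional entropy, information gain, and gain-ratio formulas the excerpt points to are not reproduced in the snippet; in their standard form (an assumption about the cited paper's notation, reusing its p(x), p(y) symbols) they read:

H(Y \mid X) = -\sum_{x \in X} p(x) \sum_{y \in Y} p(y \mid x) \log_2 p(y \mid x)

IG(X, Y) = H(Y) - H(Y \mid X)

GR(X, Y) = \frac{IG(X, Y)}{H(X)}

The excerpt's "information entropy" is taken here to be H(X), the entropy of the attribute itself, as in the usual gain-ratio definition that normalises the gain by the attribute's own entropy.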
“…Step (8). Generate a random r. If r is less than or equal to the mutation probability p_m, then randomly select one chromosome from P_3 to perform the mutation operator described in Section 5.3. The final population is marked as P_4.…”
Section: Detailed Steps of IGRDCGA | mentioning
confidence: 99%
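
The quoted step is a standard probability-gated mutation inside a genetic algorithm. A minimal sketch of that gate, assuming binary chromosomes and a simple bit-flip as the mutation operator (the operator actually defined in the cited paper's Section 5.3 is not shown in the excerpt, so the bit-flip is a placeholder):

import random

def mutation_step(p3, p_m):
    # p3: list of chromosomes, each a list of 0/1 genes (assumed encoding).
    # p_m: mutation probability. Returns the final population, P4 in the excerpt's notation.
    p4 = [chrom[:] for chrom in p3]        # work on a copy of P3
    r = random.random()                    # generate a random r
    if r <= p_m:                           # apply mutation only when r <= p_m
        chrom = random.choice(p4)          # randomly select one chromosome
        pos = random.randrange(len(chrom))
        chrom[pos] = 1 - chrom[pos]        # placeholder bit-flip mutation
    return p4

# Example: 4 chromosomes of length 6 and mutation probability 0.1
P3 = [[random.randint(0, 1) for _ in range(6)] for _ in range(4)]
P4 = mutation_step(P3, p_m=0.1)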
“…SU with harmony search algorithm is proposed for microarray data classification in Shreem et al (2016). IG with fuzzy rough sets and GA is proposed for cancer microarray data classification in Chinnaswamy and Srinivasan (2017) whereas IG with Binary Differential Evolution (BDE) is proposed for microarray data in Apolloni et al (2016). A hybrid approach that uses CFS filtering with iterative binary PSO is proposed in Jain et al (2018).…”
Section: Related Work | mentioning
confidence: 99%