2017
DOI: 10.1016/j.ins.2017.08.036

Feature selection by optimizing a lower bound of conditional mutual information

Abstract: A unified framework is proposed to select features by optimizing computationally feasible approximations of high-dimensional conditional mutual information (CMI) between features and their associated class label under different assumptions. Under this unified framework, state-of-the-art information theory based feature selection algorithms are rederived, and a new algorithm is proposed to select features by optimizing a lower bound of the CMI with a weaker assumption than those adopted by existing methods. The…
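The abstract describes replacing the intractable high-dimensional CMI with computationally feasible surrogates. As a rough illustration of that idea (not the paper's own lower bound, which is not reproduced here), the sketch below implements the classic CMIM criterion, one of the information-theoretic selectors such frameworks rederive: each new feature f maximizes the minimum over already-selected features s of I(f; y | s), a one-variable conditioning. The plug-in discrete estimators and function names are ours, not from the paper.

```python
import numpy as np

def mutual_info(x, y):
    """Plug-in estimate of I(X; Y) in nats for discrete 1-D arrays."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xs.size, ys.size))
    np.add.at(joint, (xi, yi), 1.0)          # contingency table of counts
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal p(x), shape (nx, 1)
    py = pxy.sum(axis=0, keepdims=True)      # marginal p(y), shape (1, ny)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def cond_mutual_info(x, y, z):
    """I(X; Y | Z) = sum_z p(z) * I(X; Y | Z = z) for discrete arrays."""
    total = 0.0
    for zv in np.unique(z):
        m = z == zv
        total += m.mean() * mutual_info(x[m], y[m])
    return total

def cmim_select(X, y, k):
    """Greedy forward selection with the CMIM criterion: pick the feature
    maximizing min_{s in selected} I(f; y | s), a feasible one-variable
    surrogate for the high-dimensional CMI. Assumes k <= X.shape[1]."""
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, j], y) for j in range(n_features)]
    selected = []
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            if selected:
                score = min(cond_mutual_info(X[:, j], y, X[:, s])
                            for s in selected)
            else:
                score = relevance[j]         # first pick: most relevant feature
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

With X an (n_samples, n_features) integer-coded array and y integer class labels, cmim_select(X, y, k) returns k column indices; the min-over-conditionings keeps every estimate low-dimensional, which is the computational point the abstract makes.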

Cited by 62 publications, 2017–2023 (23 citation statements); References: 41 publications.
“…We can see that the model with feature selection performs better than the model without feature selection. This finding is in accordance with previous works, which stated that feature selection can not only improve the model's forecasting performance but also reduce the storage requirement (Peng & Fan, 2017; Zhang, Song, & Gong, 2017; Costea, Ferrara, & Şerban, 2017). Among all of the experiments, the model with FRST reaches the best forecasting quality.…”
Section: The Practical Results and Statistical Examination (supporting)
confidence: 91%
“…In particular, some references addressing very high-dimensional problems, such as cancer detection via gene expression data, must be mentioned. The work of [36] presents a theoretical and practical framework for feature selection based on a conditional mutual information criterion. [35,50] focus on chemotherapy effectiveness problems solved by means of ranking (SVM-RFE) and fuzzy if-then rules, respectively.…”
Section: Introduction (mentioning)
confidence: 99%
“…(1), instead of the conventional Frobenius loss function (i.e., the least-squares loss function), because the square root of a loss function is more robust to outliers (Peng and Fan 2016, 2017a, 2017b). To optimize Eq.…”
Section: Methods (mentioning)
confidence: 99%
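For context on the robustness claim in the last excerpt, the contrast is between a squared Frobenius loss and its square root. Written with a hypothetical self-representation residual X - XW (our assumption for illustration; the cited works may use a different model), the two objectives are:

```latex
% Conventional least-squares (squared Frobenius) loss:
% each residual entry is penalized quadratically, so a
% single outlying sample can dominate the objective.
\min_{W}\; \lVert X - XW \rVert_F^{2}

% Square-root variant referenced in the excerpt:
% the same residual is penalized only linearly in its
% Frobenius norm, damping the influence of outliers.
\min_{W}\; \lVert X - XW \rVert_F
```

The damping is visible in the gradient: differentiating the square-root loss rescales the least-squares gradient by 1/‖X - XW‖_F, so large residuals contribute proportionally less than under the squared loss.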