2022
DOI: 10.1155/2022/7584374
Feature Screening via Mutual Information Learning Based on Nonparametric Density Estimation

Abstract: With the advent of the era of big data, feature selection in high- or ultra-high-dimensional data is increasingly important in statistics and machine learning fields. In this paper, we propose a marginal utility measure screening method MI-SIS based on mutual information. The proposed marginal utility measure has several appealing features compared with the existing independence screening methods. Firstly, the proposed procedure is model-free without specifying any relationship between the predictors and the r…
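As a rough illustration of the marginal-utility screening idea described in the abstract (a sketch only, not the paper's exact MI-SIS estimator), the snippet below ranks each predictor by a simple histogram plug-in estimate of its mutual information with the response and keeps the top d. The function names, the binning choice, and the plug-in estimator itself are assumptions standing in for the paper's nonparametric density estimation.

```python
import numpy as np

def mi_hist(x, y, bins=10):
    # Plug-in mutual information estimate from a 2-D histogram
    # (an illustrative stand-in for a nonparametric density estimator).
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                      # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0                               # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mi_screen(X, y, d):
    # Marginal screening: score each predictor by estimated MI with the
    # response, then keep the indices of the d highest-scoring predictors.
    scores = np.array([mi_hist(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:d]
```

With a response driven by one predictor plus noise, that predictor should receive the largest marginal MI score and survive the screen.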

Cited by 3 publications (3 citation statements)
References 26 publications
“…In the subsequent sections, we will establish the theoretical properties of the proposed independence screening procedure. Previous studies by Fan and Lv [1], Ji and Jin [22], Zhou and Wang [20] and Ni and Fang [16] have demonstrated that the sure screening property ensures the effectiveness of the independence screening procedure. Hence, it is crucial to establish the sure screening property for MIC-SIS.…”
Section: Feature Screening Property
confidence: 98%
“…The fundamental principle underlying the maximal information coefficient (MIC) is based on the concept of mutual information. Mutual Information (MI) [20] is a valuable measure in information theory, quantifying the amount of information contained in one random variable regarding another random variable. It represents the reduction in uncertainty of a random variable due to the knowledge of another random variable.…”
Section: Maximal Information Coefficient (MIC)
confidence: 99%
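The citing statement above describes mutual information as the reduction in uncertainty of one variable given another. On a discrete joint distribution this is the identity I(X;Y) = H(X) + H(Y) - H(X,Y), which a few lines can verify numerically; the joint probabilities below are made-up illustrative numbers, not data from the paper.

```python
import numpy as np

def entropy(p):
    # Shannon entropy (in nats) of a probability vector, skipping zeros.
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Illustrative joint distribution of two dependent binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

# MI as reduction in uncertainty: I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px) + entropy(py) - entropy(pxy.ravel())
```

For an independent joint (the outer product of the marginals) the same formula returns zero, matching the interpretation that knowing Y then removes no uncertainty about X.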