2020
DOI: 10.1007/s11063-020-10307-7

Feature Selection Method Based on Differential Correlation Information Entropy

Abstract: Feature selection is one of the major aspects of pattern classification systems. In previous studies, Ding and Peng recognized the importance of feature selection and proposed a minimum redundancy feature selection method to minimize redundant features for sequential selection in microarray gene expression data. However, since the minimum redundancy feature selection method mainly uses mutual information to measure the dependency between random variables, the results cannot be optimal without consideration…
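
For context on the mRMR criterion the abstract refers to: it balances relevance (mutual information between each feature and the class label) against redundancy (mutual information among already selected features). The sketch below is a minimal greedy illustration of that criterion using scikit-learn's mutual-information estimators; the function name mrmr_select and its parameters are illustrative assumptions, and this is not the differential correlation information entropy method proposed in the paper itself.

```python
# Minimal mRMR-style greedy feature selection sketch (illustrative, not the paper's DCIE method).
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k=10, random_state=0):
    """Greedily pick k features maximizing relevance minus mean redundancy."""
    n_features = X.shape[1]
    # Relevance: estimated I(f_i; y) for every feature.
    relevance = mutual_info_classif(X, y, random_state=random_state)
    selected, remaining = [], list(range(n_features))
    while len(selected) < min(k, n_features):
        best_j, best_score = None, -np.inf
        for j in remaining:
            if selected:
                # Redundancy: mean I(f_j; f_s) over already selected features.
                redundancy = np.mean([
                    mutual_info_regression(X[:, [s]], X[:, j],
                                           random_state=random_state)[0]
                    for s in selected
                ])
            else:
                redundancy = 0.0
            score = relevance[j] - redundancy  # relevance-minus-redundancy criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

The score line mirrors Ding and Peng's difference-form criterion, which subtracts the mean pairwise redundancy from the relevance at each greedy step.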

Cited by 14 publications (5 citation statements) · References 24 publications
“…Information entropy is a nonsequential index in the ideological and political teaching resource system [20,21]. The larger the entropy, the greater the disorder of the ideological and political teaching resources [22]. Information entropy can therefore judge the uncertainty of ideological and political teaching resources.…”
Section: Redundancy Elimination Methods of Ideological and… (mentioning)
confidence: 99%
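
As the excerpt states, larger entropy means greater disorder and uncertainty. A minimal Shannon-entropy sketch over an empirical distribution makes that concrete (illustrative code, not drawn from either paper):

```python
# Shannon entropy H(X) = -sum p(x) log2 p(x) of an empirical distribution (illustrative sketch).
import numpy as np
from collections import Counter

def shannon_entropy(values):
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# A uniform (maximally disordered) sample has higher entropy than a skewed one.
print(shannon_entropy(["a", "b", "c", "d"]))  # 2.0 bits
print(shannon_entropy(["a", "a", "a", "b"]))  # ~0.811 bits
```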
“…The evaluation criteria for the classification capability of features depend mainly on their ability to distinguish between different categories (Dash and Liu, 1997). Measures of feature separability fall into several groups, such as the entropy function (Wang et al., 2020), the Jeffreys-Matusita distance (JM distance) (Sen et al., 2019), and the relative distance between categories (Ell and Ashby, 2012), among others (Sweet, 2003). Other studies have used Spectral Angular Distance (SAD) and Euclidean distance (ED) to measure the difference between two spectra from different years, which has proven to be the best similarity metric for detecting bi-temporal variation (Huang et al., 2020; Ji et al., 2015).…”
Section: B. Methods (mentioning)
confidence: 99%
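
Two of the similarity metrics named in the excerpt, Spectral Angular Distance (SAD) and Euclidean distance (ED), can be sketched as follows. The function names and example spectra are hypothetical; the point is that SAD is shape-sensitive and scale-invariant, while ED is magnitude-sensitive.

```python
# SAD and ED between two spectra, as used for separability / change detection (illustrative sketch).
import numpy as np

def spectral_angle_distance(s1, s2):
    """Angle in radians between two spectral vectors; 0 means identical shape."""
    cos_theta = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def euclidean_distance(s1, s2):
    """Magnitude-sensitive distance between two spectra."""
    return float(np.linalg.norm(np.asarray(s1) - np.asarray(s2)))

s_t1 = np.array([0.12, 0.18, 0.35, 0.42])  # hypothetical reflectance, year 1
s_t2 = np.array([0.10, 0.16, 0.30, 0.38])  # hypothetical reflectance, year 2
print(spectral_angle_distance(s_t1, s_t2), euclidean_distance(s_t1, s_t2))
```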
“…VMD determines the frequency center and bandwidth of each IMF by iteratively searching for the optimal solution of the variational model, thus adaptively realizing the frequency-domain segmentation of the signal and its IMFs [23]. Literature [24] focuses on load decomposition of industrial park load (IPL), proposing an improved variational mode decomposition (IVMD) algorithm that denoises the IPL training data to improve its stability.…”
Section: Principles of Algorithm (mentioning)
confidence: 99%
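
For reference, the constrained variational problem that VMD solves iteratively (the standard formulation due to Dragomiretskiy and Zosso; it is not reproduced in the excerpt) seeks modes u_k whose analytic-signal spectra are concentrated around center frequencies ω_k while jointly reconstructing the signal f:

```latex
% Standard VMD variational problem (assumed from Dragomiretskiy & Zosso, not quoted from the excerpt).
\min_{\{u_k\},\{\omega_k\}} \;
  \sum_{k=1}^{K}
  \left\|
    \partial_t \!\left[
      \left( \delta(t) + \tfrac{j}{\pi t} \right) * u_k(t)
    \right] e^{-j\omega_k t}
  \right\|_2^2
\quad \text{s.t.} \quad \sum_{k=1}^{K} u_k = f
```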