2022
DOI: 10.1016/j.eswa.2022.116621

A hybrid feature selection approach based on information theory and dynamic butterfly optimization algorithm for data classification

Cited by 54 publications (24 citation statements) · References 46 publications
“…where $\chi_F^2$ represents the chi-square value. Iman and Davenport showed that Friedman's $\chi_F^2$ is too conservative and proposed a better statistic, shown in equation (13).…”
Section: A Comparison Of Fafs_hfs With Other Feature Selection Methodsmentioning
confidence: 99%
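The excerpt above cites "equation (13)" of the citing paper without reproducing it. For context, the standard Iman-Davenport correction of the Friedman test reads as follows (a sketch from the statistics literature, not copied from the cited paper; $N$ is the number of datasets, $k$ the number of compared methods, and $R_j$ the average rank of method $j$):

$$\chi_F^2 = \frac{12N}{k(k+1)}\left[\sum_{j=1}^{k} R_j^2 - \frac{k(k+1)^2}{4}\right], \qquad F_F = \frac{(N-1)\,\chi_F^2}{N(k-1) - \chi_F^2},$$

where $F_F$ is distributed according to the $F$-distribution with $k-1$ and $(k-1)(N-1)$ degrees of freedom.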
“…For example, Jain and Singh introduced a two-phase hybrid feature selection method based on principal component analysis (PCA), ReliefF, and an adaptive support vector machine (SVM) [12]. Tiwari proposed a hybrid of information theory and the dynamic butterfly optimization algorithm [13]. Uzer et al. proposed a hybrid feature selection method based on sequential forward selection (SFS), sequential backward selection (SBS), and PCA [14].…”
Section: Introductionmentioning
confidence: 99%
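The excerpt above surveys two-phase hybrid selectors. Below is a minimal sketch of that filter-then-wrapper pattern, assuming scikit-learn: an information-theoretic (mutual information) filter prunes the feature pool, then greedy forward selection with an SVM picks the final subset. It illustrates the general two-phase idea only, not the exact PCA/ReliefF or DBOA methods of [12]-[14]; the dataset, pool size, and subset size are arbitrary demo choices.

```python
# Minimal filter-then-wrapper sketch (illustrative, not the cited methods).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif, SequentialFeatureSelector
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Phase 1 (filter): rank features by mutual information with the class
# label and keep the top half of the pool.
mi = mutual_info_classif(X, y, random_state=0)
keep = np.argsort(mi)[::-1][: X.shape[1] // 2]
X_filtered = X[:, keep]

# Phase 2 (wrapper): greedy forward selection driven by cross-validated
# SVM accuracy over the pre-filtered pool.
sfs = SequentialFeatureSelector(
    SVC(kernel="rbf"), n_features_to_select=10, direction="forward", cv=5
)
sfs.fit(X_filtered, y)

# Map the wrapper's boolean mask back to original feature indices.
selected = keep[sfs.get_support()]
print("selected feature indices:", sorted(selected))
```

The filter phase keeps the wrapper search cheap; a metaheuristic such as DBOA would replace the greedy forward search in phase 2 while keeping the same overall structure.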
“…Recently developed original algorithms include the forensic-based investigation algorithm (FBI) [8], the slime mould algorithm (SMA) [11], the group teaching optimization algorithm (GTOA) [12], dynamic group optimization (DGO) [13], the African vultures optimization algorithm (AVOA) [14], the Rao-3 algorithm [15], the gorilla troops optimizer (GTO) [16], smell agent optimization (SAO) [17], the sparrow search algorithm (SSA) [18], the artificial ecosystem optimizer (AEO) [19], the starling murmuration optimizer (SMO) [20], the dwarf mongoose optimization algorithm (DMOA) [21], the war strategy optimization algorithm (WSOA) [22], the dynamic butterfly optimization algorithm (DBOA) [23], the artificial hummingbird optimization technique (AHOT) [24], and the antlion optimization algorithm (ALOA) [25].…”
Section: Introductionmentioning
confidence: 99%
“…Thus, finding a suitable set of predictors from high-dimensional data consisting of physical factors as the input to the runoff prediction model remains a challenge for medium- and long-term runoff prediction. In information theory, the mutual information (MI) method is a powerful tool for analyzing linear and nonlinear relationships between variables (Elkiran et al., 2021), but simply selecting predictors through the correlation analysis provided by MI introduces many redundant features (Tiwari and Chaturvedi, 2022). To address this problem in traditional high-dimensional data preprocessing, principal component analysis (PCA) is typically used to reduce dimensionality, as it can effectively reduce redundant variables and feature dimensions (Ouyang et al., 2022).…”
Section: Introductionmentioning
confidence: 99%
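A small sketch of the redundancy problem just described, assuming scikit-learn and synthetic data: two nearly identical predictors both score high under plain MI ranking, while a greedy max-relevance/min-redundancy (mRMR-style) step penalizes the duplicate. This is illustrative only and is not the procedure of any of the cited papers.

```python
# Pure MI ranking picks redundant predictors; an mRMR-style greedy step
# subtracts average redundancy from relevance (illustrative sketch).
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)     # nearly a copy of x1 (redundant)
x3 = rng.normal(size=n)                 # independent, weakly informative
y = 2.0 * x1 + 0.5 * x3 + 0.1 * rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

relevance = mutual_info_regression(X, y, random_state=0)
# Descending MI order: the redundant pair x1, x2 outranks x3.
print("pure MI ranking:", np.argsort(relevance)[::-1])

# mRMR-style greedy pick: score = relevance - mean redundancy
# against the features already selected.
selected = [int(np.argmax(relevance))]
while len(selected) < 2:
    best, best_score = None, -np.inf
    for j in range(X.shape[1]):
        if j in selected:
            continue
        red = np.mean([
            mutual_info_regression(X[:, [j]], X[:, s], random_state=0)[0]
            for s in selected
        ])
        score = relevance[j] - red
        if score > best_score:
            best, best_score = j, score
    selected.append(best)
print("mRMR-style pick:", selected)  # favors x3 over the near-duplicate x2
```

The duplicate feature carries almost no new information about the target, which is exactly why MI-only filters inflate the predictor set and why the excerpt pairs MI with a dimensionality-reduction or redundancy-control step.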