2021
DOI: 10.1007/s00500-021-06375-z
Efficient feature selection for inconsistent heterogeneous information systems based on a grey wolf optimizer and rough set theory

Cited by 15 publications (5 citation statements)
References 40 publications
“…The position vector specifies the variable values for the problem under study. For example, M_{3,2} = [5, 10, 7, 9] means that the problem has 4 variables whose values, for monkeypox copy number 3 at generation 2, are 5, 10, 7 and 9, respectively. At first (generation j = 1), the virus searches for the desired cell and then performs the rest of the steps.…”
Section: Monkeypox Optimization (MO) Algorithm
confidence: 99%
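The quoted passage describes a standard population-based encoding: each candidate (a "virus copy") carries a position vector whose entries are the decision-variable values at a given generation. A minimal Python sketch of that encoding, assuming a simple uniform random initialization (the function name `init_population` and the bounds are illustrative, not from the cited algorithm):

```python
import random

# Illustrative sketch, not the cited paper's implementation.
# population[i][k] holds the value of variable k for copy i
# at the current generation, mirroring M_{i,j} in the quote.
def init_population(n_copies, n_vars, low, high, seed=0):
    rng = random.Random(seed)
    return [[rng.uniform(low, high) for _ in range(n_vars)]
            for _ in range(n_copies)]

# 5 copies, each a position vector over a 4-variable problem,
# analogous to M_{3,2} = [5, 10, 7, 9] in the citation statement.
population = init_population(n_copies=5, n_vars=4, low=0.0, high=10.0)
```

At generation j = 1 such vectors would then be evaluated and updated by the algorithm's search steps, which the quote only summarizes.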
“…Yet, another direction is to mimic how natural swarms like birds, fish, and insects work collaboratively to find a food source. Examples in this direction are particle swarm optimization [3], artificial bee colony [4], grey wolf optimization [5] and whale optimization algorithm (WOA) [6]. One more direction is to mimic the natural physical phenomena involving the physics of matter and energy.…”
Section: Introduction
confidence: 99%
“…Feature selection is employed to discern significant features from inconsequential features within a predetermined feature collection (Hamed & Mohamed, 2023; Hamed & Nassar, 2021; Ablel-Rheem et al., 2020). Feature selection aims to minimize the size of high-dimensional classification issues while improving prediction accuracy in classification problems (Sağbaş & Ballı, 2024).…”
Section: Introduction
confidence: 99%
“…Mobile continuous authentication models typically employ feature selection to enhance the accuracy of machine learning-based biometric authentication for smartphone users (Hamed & Nassar, 2021). The literature also emphasizes the use of bioinspired feature extraction algorithms such as Grey Wolf Optimization (GWO) (Almazroi & Eltoukhy, 2023), Particle Swarm Optimization (PSO) (Rostami et al., 2020), Whale Optimization Algorithm (WOA) (Mirjalili & Lewis, 2016), Harris Hawks Optimization (HHO) (Heidari et al., 2019), and Bayesian Optimization Algorithm (BOA) (Yang, Liu & Wen, 2024).…”
Section: Introduction
confidence: 99%
“…Feature selection is utilized to identify important features from irrelevant features of a predefined feature set [1,2]. The key objectives of feature selection are to reduce data dimensionality and improve prediction performance.…”
Section: Introduction
confidence: 99%