2019
DOI: 10.3390/sym11121470
Feature Selection of Grey Wolf Optimizer Based on Quantum Computing and Uncertain Symmetry Rough Set

Abstract: Considering the crucial influence of feature selection on data classification accuracy, a grey wolf optimizer based on quantum computing and an uncertain symmetry rough set (QCGWORS) is proposed. QCGWORS applies three theories in parallel to feature selection, each contributing its own advantages to the optimization of the feature selection algorithm. Quantum computing balances global and local search when exploring feature sets. The grey wolf optimizer can effectively explore all pos…
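The abstract pairs quantum computing with the grey wolf optimizer for feature selection. In quantum-inspired evolutionary algorithms generally (the paper's exact QCGWORS encoding is not shown in the truncated abstract), each feature is represented by a qubit pair (alpha, beta) with alpha² + beta² = 1, and "observing" the qubits collapses them into a binary feature mask. A minimal sketch of that standard encoding, under those assumptions:

```python
import math
import random

def observe(qubits, rng):
    """Collapse each qubit (alpha, beta) into a 0/1 feature decision.
    A feature is selected with probability beta**2 -- the standard
    quantum-inspired encoding, not necessarily the paper's exact scheme."""
    return [1 if rng.random() < beta * beta else 0 for _, beta in qubits]

rng = random.Random(1)
n_features = 8
# Start every qubit in uniform superposition: alpha = beta = 1/sqrt(2),
# so each feature is initially selected with probability 0.5.
qubits = [(1 / math.sqrt(2), 1 / math.sqrt(2))] * n_features
mask = observe(qubits, rng)
print(mask)  # a binary feature-selection vector of length 8
```

In a full algorithm, a rotation-gate update would then bias each qubit's beta toward the best masks found so far, tightening the search while the superposition keeps some global exploration alive.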

Cited by 8 publications (4 citation statements) · References 34 publications
“…This algorithm [ 85 ] is inspired by the gray wolf's hunting policies and the leadership role. Gray wolves divide the population into four levels: alpha, beta, delta, and omega.…”
Section: Nature-Inspired Algorithms for Feature Selection
confidence: 99%
See 1 more Smart Citation
“…This algorithm [ 85 ] is inspired by the gray wolf's hunting policies and the leadership role. Gray wolves divide the population into four levels: alpha, beta, delta, and omega.…”
Section: Nature-inspired Algorithms For Feature Selectionmentioning
confidence: 99%
“…The searching task is done by following the location of alpha, beta, and delta. This algorithm [ 85 ] has been used for feature selection in COVID-19 detection.…”
Section: Nature-Inspired Algorithms for Feature Selection
confidence: 99%
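The citation statements above describe the core grey wolf mechanic: the pack is ranked into alpha, beta, delta, and omega levels, and the search proceeds by having wolves follow the positions of the three leaders. A minimal sketch of that canonical GWO update (the base algorithm, not the paper's QCGWORS variant — no quantum encoding or rough-set fitness):

```python
import random

def gwo_minimize(f, dim, bounds, n_wolves=20, iters=200, seed=0):
    """Minimal canonical Grey Wolf Optimizer sketch.
    Wolves step toward the three best pack members (alpha, beta, delta);
    the coefficient `a` decays from 2 to 0, shifting the pack from
    exploration to exploitation."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)  # rank the pack: best three are the leaders
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2 * (1 - t / iters)  # linearly decays 2 -> 0
        for i in range(n_wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    x += leader[d] - A * D  # step toward each leader
                new.append(min(max(x / 3, lo), hi))  # average, then clamp
            wolves[i] = new
    return min(wolves, key=f)

# Demo on the sphere function (minimum 0 at the origin).
best = gwo_minimize(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10))
print(best)
```

For feature selection, as in the paper's setting, the continuous positions would be binarized (e.g. via a transfer function or the qubit observation above) and `f` replaced by a classification-accuracy or rough-set dependency measure.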
“…Compare with Other Methods. We compare against Hu's [19] neighborhood rough set algorithm based on heuristic information, FASRSR [13], and QCGWORS [20], which is based on a swarm intelligence optimization algorithm. The experimental results for the best attribute subset found by the four algorithms are shown in Table 4, and the classification accuracy is shown in Table 5.…”
Section: Experimental Performance Test
confidence: 99%
“…To solve the above drawbacks, AFE is a novel way to develop strategies for selecting useful subset features. It can find these useful features in datasets to generate a new set of features for clustering, and can improve performance by deleting irrelevant and redundant features [24]. Moreover, it can initialize the number of clusters for clustering and adjust parameter values for classification.…”
Section: Introduction
confidence: 99%