2022
DOI: 10.3390/math10193606
A Quantum-Based Chameleon Swarm for Feature Selection

Abstract: The Internet of Things is widely used, which results in the collection of enormous amounts of data with numerous redundant, irrelevant, and noisy features. In addition, many of these features need to be managed. Consequently, developing an effective feature selection (FS) strategy becomes a difficult goal. Many FS techniques, based on bioinspired metaheuristic methods, have been developed to tackle this problem. However, these methods still suffer from limitations; so, in this paper, we developed an alternativ…

Cited by 9 publications (3 citation statements) | References 67 publications
“…It has been analyzed that certain limitations of classical computations can be addressed by applying quantum technologies (QT). Therefore, various studies incorporate quantum technologies such as quantum transfer learning, the quantum-based chameleon swarm, and quantum annealing for FE and FS methodologies [27][28][29]. In 2023, research was published showing how a quantum computation (QC) based feature selection method can reduce model complexity and improve ML interpretability.…”
Section: Related Work
Mentioning confidence: 99%
“…PCA reduces the dimensionality of the predictor space, creating classification models that prevent overfitting. PCA linearly transforms predictors to remove redundant dimensions and generates a new set of variables called principal components [22], [23].…”
Section: A. Preprocessing
Mentioning confidence: 99%
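The PCA transform described in the statement above can be sketched as follows — a minimal NumPy illustration of projecting centered predictors onto the top eigenvectors of their covariance matrix (the function name and dimensions are ours, not from the cited paper):

```python
import numpy as np

def pca_transform(X, n_components):
    """Project X (n_samples x n_features) onto its top principal components."""
    # Center each predictor so the components capture variance, not the mean
    Xc = X - X.mean(axis=0)
    # Eigen-decompose the covariance matrix of the centered predictors
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns eigenvalues in ascending order; take the largest first
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]
    # The new variables (principal components) are linear combinations
    # of the original predictors
    return Xc @ components

# Usage: reduce 5 predictors to 2 principal components
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca_transform(X, 2)
print(Z.shape)  # (100, 2)
```

Because the components are eigenvectors of the covariance matrix, the projected variables are mutually uncorrelated, which is what removes the redundant dimensions the quote refers to.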
“…Many efforts have been made to improve the performance of emotional state recognition in speech through feature selection. The main aim of feature selection is to choose the most important acoustic features, which can reduce the computational cost of SERs and improve their recognition accuracy [16][17][18].…”
Section: Introduction
Mentioning confidence: 99%
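The aim stated above — keeping only the most important features — can be sketched with a simple filter-style selector; this scores each feature by its absolute correlation with the labels (a generic illustration, not the FS method of the cited papers or of the chameleon swarm approach):

```python
import numpy as np

def select_top_k(X, y, k):
    """Filter-style feature selection: keep the k features whose
    absolute correlation with the target y is largest."""
    scores = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    )
    # Indices of the k highest-scoring features
    return np.argsort(scores)[::-1][:k]

# Usage: 6 candidate features, only feature 2 actually drives the target
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 2] + rng.normal(scale=0.1, size=200)
print(select_top_k(X, y, 1))  # [2]
```

Metaheuristic FS methods such as the quantum-based chameleon swarm search the space of feature subsets directly rather than scoring features one at a time, but the objective is the same: a smaller feature set that lowers computational cost without hurting accuracy.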