2022
DOI: 10.1016/j.knosys.2022.108517

A dynamic stochastic search algorithm for high-dimensional optimization problems and its application to feature selection


Cited by 10 publications (7 citation statements)
References 96 publications
“…Therefore, the complexity can be mathematically represented as O(C_QL + C_QLESCA-SCNN + C_SVM), where O denotes the worst-case time complexity, and C_QL, C_QLESCA-SCNN, and C_SVM indicate the complexity of the Q-learning implementation that modifies the location of each QLESCA search agent, the complexity of the QLESCA-SCNN feature selection method, and the execution time of the SVM classifier in the training phase, respectively. Determining the computational complexity of many metaheuristic algorithms typically involves the analysis of three components [40]:…”
Section: Proposed Framework (mentioning)
confidence: 99%
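For readability, the quoted complexity bound can be typeset as below (a minimal LaTeX rendering of the expression in the statement above; the three component symbols are exactly those defined in the quote, and nothing beyond the quote is assumed):

\[
  O\bigl(C_{\mathrm{QL}} + C_{\mathrm{QLESCA\text{-}SCNN}} + C_{\mathrm{SVM}}\bigr)
\]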
“…[39] group teaching optimization algorithm, 2022; 14. [40] dynamic stochastic search algorithm, 2022; 15. [41] Dynamic Butterfly algorithm, 2022; 16.…”
Section: Introduction (mentioning)
confidence: 99%
“…The suggested model's performance was assessed using 12 datasets from the UCI repository. In another work, the authors (Liu et al., 2022) suggested a dynamic stochastic search (DSS) algorithm for FS problems. It is designed specifically for high-dimensional optimization problems.…”
Section: Related Work (mentioning)
confidence: 99%
“…They are effective in resolving high-dimensional, NP-hard, non-differentiable, non-convex optimization issues [9]. Among the benefits that have contributed to the success of metaheuristic algorithms are their effectiveness in exploring unknown, nonlinear, and discrete search spaces, the simplicity of their principles, their independence from the nature of the issue, and their ease of implementation [10].…”
Section: Introduction (mentioning)
confidence: 99%
“…As explained in [14,15], evolutionary algorithms, such as the genetic algorithm and the evolution strategy [16], have been built by modelling biological evolutionary traits such as crossover, mutation, and selection. Physics-based algorithms are motivated by physical laws, such as the equilibrium algorithm (EA) [10,13] and the Henry gas solubility algorithm [12]. Swarm intelligence algorithms, such as the whale optimization algorithm (WOA) [17], jellyfish search optimization (JFSO) [18], the heap-based technique (HT) [19], grasshopper optimization (GO) [20], particle swarm optimization [21], manta rays foraging optimization (MRFO) [22], the artificial bee colony [23], and the marine predators algorithm (MPA) [24], are a family of algorithms influenced by swarming and animal group behavior.…”
Section: Introduction (mentioning)
confidence: 99%