2008
DOI: 10.1016/j.eswa.2007.08.088
Particle swarm optimization for parameter determination and feature selection of support vector machines

Cited by 795 publications (343 citation statements)
References 25 publications
“…This happens when searching for the optimal parameter pair on the polynomial kernel, whose computation time is relatively longer than that of the other kernels, since this kernel has four parameters that need to be optimized. Conversely, if the search region is too small, it might render a satisfactory outcome impossible [43]. Therefore, we carefully analyzed the search region before performing the optimization simulation.…”
Section: Performance Evaluation (mentioning)
Confidence: 99%
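The excerpt above stresses sizing the search region before tuning the four polynomial-kernel parameters. A minimal sketch of a bounded random search over such a region (the bounds, the toy objective, and all values are illustrative assumptions, not taken from the cited papers):

```python
import random

# Hypothetical bounds for the four polynomial-kernel parameters
# (C, gamma, coef0, degree); values are illustrative, not from [43].
BOUNDS = {"C": (1e-2, 1e2), "gamma": (1e-3, 1.0), "coef0": (0.0, 5.0), "degree": (2, 5)}

def sample_params(rng):
    """Draw one candidate uniformly from the bounded search region."""
    return {
        "C": rng.uniform(*BOUNDS["C"]),
        "gamma": rng.uniform(*BOUNDS["gamma"]),
        "coef0": rng.uniform(*BOUNDS["coef0"]),
        "degree": rng.randint(*BOUNDS["degree"]),
    }

def toy_accuracy(p):
    """Stand-in for cross-validated SVM accuracy, peaked inside the region."""
    return 1.0 - abs(p["gamma"] - 0.1) - 0.01 * abs(p["C"] - 10.0)

def search(n_trials=200, seed=0):
    """Evaluate n_trials candidates and keep the best one."""
    rng = random.Random(seed)
    return max((sample_params(rng) for _ in range(n_trials)), key=toy_accuracy)

best = search()
```

Narrowing `BOUNDS` is exactly the trade-off the excerpt describes: a tighter region is cheaper to search but may exclude the optimum.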
“…Moreover, binary PSO [225], which is a discrete optimization method, was employed to select the input features, which were binary coded [226,227]. In [226], a modified version of binary PSO was proposed in which an EA-like mutation operator was applied to the binary vectors.…”
Section: Input Layer Optimization (mentioning)
Confidence: 99%
“…In [226], a modified version of binary PSO was proposed in which an EA-like mutation operator was applied to the binary vectors. Similarly, ACO, which traditionally solves discrete optimization problems, was applied to select input features and to train an FNN in a hybrid manner [228].…”
Section: Input Layer Optimization (mentioning)
Confidence: 99%
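The excerpts above describe binary PSO with an EA-like mutation operator for input-feature selection. A self-contained sketch of that idea (particle counts, coefficients, mutation rate, and the toy fitness are illustrative assumptions, not taken from [225-227]):

```python
import math
import random

def binary_pso(n_features, fitness, n_particles=10, iters=30,
               w=0.7, c1=1.5, c2=1.5, p_mut=0.05, seed=1):
    """Binary PSO with an EA-like bit-flip mutation; bits mark selected features."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # per-particle best positions
    gbest = max(pbest, key=fitness)[:]              # swarm-wide best position
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))      # velocity -> bit probability
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_features):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = 1 if rng.random() < sig(vel[i][d]) else 0
                if rng.random() < p_mut:            # EA-like mutation operator
                    pos[i][d] ^= 1
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = max(pbest + [gbest], key=fitness)[:]
    return gbest

# Toy fitness: features 0-2 are "informative", the rest carry a small cost.
fit = lambda bits: sum(bits[:3]) - 0.1 * sum(bits[3:])
selected = binary_pso(8, fit)
```

In practice the fitness would be cross-validated classifier accuracy on the selected feature subset; the mutation keeps the swarm from collapsing onto one subset too early.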
“…This problem is noisy and multimodal, which makes gradient-descent algorithms fail. For these reasons, stochastic search algorithms have been preferred by other researchers: simulated annealing in [15,26], particle swarm optimization in [27,28], and comprehensive learning particle swarm optimization combined with a BFGS algorithm in [29]. The present paper is based on the stochastic search algorithm known as the cross-entropy method (CEM), whose efficiency has been demonstrated in several works.…”
Section: Selection of the SVR Surrogate Model Parameters (mentioning)
Confidence: 99%
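The last excerpt selects SVR surrogate-model parameters with the cross-entropy method. A minimal CEM sketch (the Gaussian sampling scheme, sample sizes, and toy quadratic stand-in for the cross-validation error are illustrative assumptions, not from the cited work):

```python
import random
import statistics

def cem_minimize(objective, mean, std, n_samples=50, n_elite=10, iters=20, seed=2):
    """Cross-entropy method: repeatedly refit a Gaussian to the elite samples."""
    rng = random.Random(seed)
    dim = len(mean)
    mean, std = list(mean), list(std)
    for _ in range(iters):
        samples = [[rng.gauss(mean[d], std[d]) for d in range(dim)]
                   for _ in range(n_samples)]
        elite = sorted(samples, key=objective)[:n_elite]   # lowest objective wins
        for d in range(dim):
            col = [s[d] for s in elite]
            mean[d] = statistics.fmean(col)
            std[d] = statistics.stdev(col) + 1e-6          # floor avoids collapse
    return mean

# Toy stand-in for the SVR cross-validation error over two parameters.
obj = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2
found = cem_minimize(obj, mean=[0.0, 0.0], std=[2.0, 2.0])
```

Because CEM only needs objective evaluations, it tolerates the noise and multimodality that defeat gradient descent in the excerpt's setting.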