2008
DOI: 10.1016/j.neucom.2008.04.027
A novel LS-SVMs hyper-parameter selection based on particle swarm optimization

Cited by 176 publications (62 citation statements)
References 30 publications
“…In order to automate this process and to avoid an exhaustive or random exploration of parameters, different authors have deployed search and optimization techniques [3,4,6,7,8,13,14,15]. In this context, the search space consists of the set of possible parameter configurations, and the objective function corresponds to a performance measure (e.g., precision estimated by cross-validation) obtained by the SVM on the problem.…”
Section: SVM Parameter Selection
confidence: 99%
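The framing above can be sketched in a few lines: candidate configurations form the search space, and a score function plays the role of the objective. This is a minimal illustration only; the `surrogate_cv_score` function below is a synthetic stand-in for a real cross-validation score (no dataset or model is attached to this excerpt), and its peak location is an arbitrary assumption.

```python
import numpy as np

# Sketch of hyper-parameter selection as an optimization problem.
# The search space is a grid of (C, sigma) candidates; the objective
# stands in for a cross-validation score. surrogate_cv_score is a
# synthetic surrogate, NOT a real CV estimate.
def surrogate_cv_score(C, sigma):
    """Hypothetical stand-in for CV accuracy, peaked at C=10, sigma=1."""
    return np.exp(-(np.log10(C) - 1) ** 2 - np.log10(sigma) ** 2)

C_grid = [0.1, 1, 10, 100]
sigma_grid = [0.1, 1, 10]

# Exhaustive grid search: evaluate every configuration, keep the best.
best = max(
    ((C, s) for C in C_grid for s in sigma_grid),
    key=lambda p: surrogate_cv_score(*p),
)
print(best)  # the configuration maximizing the surrogate
```

Search and optimization techniques such as PSO replace this exhaustive enumeration with a guided exploration of the same space.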
“…The main advantages of PSO-based search are the simplicity of the search process and ease of implementation, reflected in the small number of parameters to be adjusted during initialization and search. PSO also accomplishes an optimization task without the mutation and crossover operations that are essential steps of GAs [17].…”
Section: Particle Swarm Optimization
confidence: 99%
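The small parameter set mentioned above (inertia weight and two acceleration coefficients) and the absence of mutation/crossover can be seen in a minimal PSO loop. This is a generic sketch, not the paper's implementation; the sphere objective and all constants are illustrative.

```python
import numpy as np

# Minimal PSO sketch: only velocity/position updates, no mutation or
# crossover. The objective is a toy function standing in for a
# CV-error surface over hyper-parameters.
rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective to minimize, optimum at the origin."""
    return np.sum(x**2, axis=-1)

n_particles, dim, iters = 20, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5           # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()                   # personal best positions
pbest_val = sphere(pbest)
gbest = pbest[np.argmin(pbest_val)]  # global best position

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Velocity update: inertia + cognitive pull + social pull.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = sphere(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(gbest, sphere(gbest))  # swarm best near the origin
```

Note that only three behavioral constants (`w`, `c1`, `c2`) need tuning, which is the simplicity the excerpt contrasts with GA operators.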
“…This least-squares formulation [15,16] not only significantly reduces the complexity of solving the SVM learning problem but also provides an efficient means for estimating the generalization ability of a nonlinear SVM classifier and for accomplishing feature selection with heuristic methods [17] more efficiently.…”
Section: Introduction
confidence: 99%
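The complexity reduction comes from replacing the SVM's quadratic program with a single linear system. The sketch below follows the standard LS-SVM formulation under that assumption; the toy data, kernel width `sigma`, and regularization `gamma` are illustrative choices, not values from the paper.

```python
import numpy as np

# LS-SVM sketch: training reduces to solving one linear system
# instead of a quadratic program.
def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    # Block system from the KKT conditions:
    # [ 0   1^T         ] [b]   [0]
    # [ 1   K + I/gamma ] [a] = [y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]           # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy two-class problem with labels in {-1, +1}.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
b, alpha = lssvm_fit(X, y)
pred = np.sign(lssvm_predict(X, b, alpha, X))
```

Because training is a dense linear solve, refitting for a new `(gamma, sigma)` candidate inside a PSO loop is cheap, which is what makes the heuristic hyper-parameter search practical.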
“…However, GA and SA are both complex. Peng proposed a selection method based on particle swarm optimization (PSO) [9], which is easy to implement and has relatively high optimization efficiency. However, a drawback of PSO is that it may become trapped in local optima.…”
Section: Introduction
confidence: 99%