2011
DOI: 10.1007/s00521-011-0603-9
Parameter selection of support vector machines and genetic algorithm based on change area search

Abstract: Generalization performance of support vector machines (SVM) with the Gaussian kernel is influenced by its model parameters: the error penalty parameter and the Gaussian kernel parameter. After studying the characteristics and properties of simultaneous variation of these parameters via a parameter analysis table, a new area distribution model is proposed, which consists of an optimal straight line, reference points of area boundaries, an optimal area, a transition area, unde…

Cited by 28 publications (13 citation statements) · References 10 publications
“…Here ε-SVR was used with the radial basis function (RBF) as the kernel function. The generalization ability of SVR is affected by two parameters: the penalty factor C (C > 0) and the kernel parameter σ² [60]. A larger C means the model has less tolerance for errors, leading to overfitting; a smaller C makes the model prone to underfitting.…”
Section: Methods
confidence: 99%
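The C/σ² trade-off described in the statement above can be illustrated with a small grid search. This is a hedged sketch on a toy regression problem (not data from the cited study), assuming scikit-learn is available; note that scikit-learn parameterizes the RBF kernel by gamma, which corresponds to 1/(2σ²).

```python
# Illustrative grid search over the penalty C and RBF kernel width for
# epsilon-SVR, using scikit-learn (toy data, not the paper's experiments).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# gamma = 1 / (2 * sigma^2): small gamma -> wide kernel (smoother fit),
# large gamma -> narrow kernel (risk of overfitting).
grid = GridSearchCV(
    SVR(kernel="rbf", epsilon=0.1),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1, 10]},
    cv=5,
)
grid.fit(X, y)
print(grid.best_params_)
```

An exhaustive grid like this is the baseline that search methods such as GA or PSO try to beat in cost, since each grid point requires a full cross-validation run.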
“…Step 7: Update the current particle velocity and position, and generate the next generation of the population by Equations (18) and (20). Step 8: If the stopping condition is met, terminate the iteration; otherwise, return to Step 3.…”
Section: Parameter Combinations of SLSSVR Optimized by PSO
confidence: 99%
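The velocity/position update in Steps 7–8 follows the usual particle swarm scheme. Since Equations (18) and (20) of the citing paper are not reproduced here, this sketch uses the standard textbook PSO update rules on a toy objective, with a maximum-iteration stopping condition standing in for the paper's criterion.

```python
# Minimal particle swarm optimization sketch (standard update rules;
# the citing paper's specific Equations (18) and (20) are assumptions here).
import numpy as np

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                         # particle velocities
    pbest = x.copy()                             # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()     # global best position
    for _ in range(iters):                       # stop condition: max iterations
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Step 7: update velocity, then position.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Toy objective: sphere function, minimum at the origin.
best = pso(lambda p: np.sum(p ** 2))
print(best)
```

In the SVM setting, each particle would encode a (C, σ²) pair and f would be a cross-validation error rather than this toy function.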
“…But these methods are inefficient and easily fall into local optima. Zhao et al. used the gradient descent method to search for the optimal parameters, but its performance is sensitive to initialization and it, too, easily falls into local optima.…”
Section: Introduction
confidence: 99%
“…Different from conventional heuristic methods, the GA is a general adaptive optimization searching method based on the Darwinian principle of ‘survival of the fittest’ [55][56][57].…”
Section: Genetic Algorithm (GA)
confidence: 99%
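The "survival of the fittest" principle above can be sketched as a minimal real-coded genetic algorithm. The fitness function, parameter bounds, and operators below are illustrative stand-ins (not the paper's cross-validation objective): a population evolves by elitist selection, arithmetic crossover, and uniform mutation.

```python
# Hedged sketch of a basic real-coded GA: the fittest half of each
# generation survives and breeds the next. Fitness here is a toy function
# with a single peak, standing in for an SVM cross-validation score.
import random

def ga(fitness, bounds, pop_size=30, gens=60, pc=0.9, pm=0.1, seed=1):
    random.seed(seed)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 2]              # survival of the fittest
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            if random.random() < pc:                 # arithmetic crossover
                t = random.random()
                child = [t * u + (1 - t) * w for u, w in zip(a, b)]
            else:
                child = a[:]
            for i, (lo, hi) in enumerate(bounds):    # uniform mutation
                if random.random() < pm:
                    child[i] = random.uniform(lo, hi)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Toy fitness peaking at a hypothetical optimum (C, gamma) = (10, 0.5).
best = ga(lambda p: -((p[0] - 10) ** 2 + (p[1] - 0.5) ** 2),
          bounds=[(0.1, 100), (0.01, 10)])
print(best)
```

Elitism guarantees the best individual is never lost between generations, which is one way GA-based parameter search avoids the initialization sensitivity attributed to gradient descent above.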