2005
DOI: 10.1016/j.neucom.2004.11.022

Evolutionary tuning of multiple SVM parameters

Cited by 448 publications (218 citation statements)
References 13 publications
“…Multi-scale kernels have been mainly used with evolutionary algorithms [19,32,17] or gradient-based methods for specific applications [36,6,24]. The main problem with evolutionary approaches is the high computational cost and the necessity of tuning a large number of parameters associated to the algorithm.…”
Section: Multi-scale Case (mentioning)
confidence: 99%
“…The general motivation for the use of multi-scale kernels is that, in real-world applications, the attributes can present very different nature, which hampers the performance of spherical kernels (i.e., with the same kernel width for each attribute) [24,17]. However, the number of parameters (as many as the number of features) makes the computational cost prohibitive when considering a cross-validation technique.…”
Section: Introduction (mentioning)
confidence: 99%
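A multi-scale (anisotropic) Gaussian kernel assigns one width per attribute rather than a single shared width, which is the construction the two statements above refer to. Below is a minimal sketch of such a kernel for use as a callable kernel with scikit-learn's SVC; the helper name anisotropic_rbf, the library choice, and the example widths are illustrative assumptions, not taken from the cited works.

```python
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances
from sklearn.svm import SVC

def anisotropic_rbf(gammas):
    """Multi-scale Gaussian kernel: k(x, z) = exp(-sum_i gammas[i] * (x_i - z_i)^2).

    A spherical kernel uses one shared width; here each attribute gets its own.
    """
    gammas = np.asarray(gammas, dtype=float)

    def kernel(X, Z):
        # Scaling each feature by sqrt(gamma_i) turns the weighted squared
        # distance into an ordinary squared Euclidean distance.
        Xs = X * np.sqrt(gammas)
        Zs = Z * np.sqrt(gammas)
        return np.exp(-euclidean_distances(Xs, Zs, squared=True))

    return kernel

# Illustrative usage (three features, hence three widths):
# clf = SVC(C=1.0, kernel=anisotropic_rbf([0.5, 2.0, 0.1]))
# clf.fit(X_train, y_train)
```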
“…The Covariance Matrix Adaptation Evolution Strategy (Hansen and Ostermeier 2001) is a modification of standard genetic algorithm that updates the covariance matrix used for selecting the next generation. Friedrichs and Igel (2005) used this method to optimize multiple parameters of Support Vector Machines (SVM). Merelo et al (2001) presented G-LVQ, which uses a genetic algorithm to optimize Learning Vector Quantization (Kohonen 1998).…”
Section: Introduction (mentioning)
confidence: 99%
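As a rough sketch of the approach the statement above attributes to Friedrichs and Igel (2005), i.e. CMA-ES tuning multiple SVM hyperparameters, the snippet below searches the regularization constant C and one Gaussian width per feature by cross-validated accuracy, using the third-party pycma package together with scikit-learn. The dataset, log-space encoding, evaluation budget, and objective are assumptions for illustration, not the original experimental setup.

```python
import numpy as np
import cma                                     # pycma: pip install cma
from sklearn.datasets import load_breast_cancer
from sklearn.metrics.pairwise import euclidean_distances
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)     # illustrative dataset
n_features = X.shape[1]

def neg_cv_accuracy(log_params):
    """Objective for CMA-ES: minimize negative cross-validated accuracy.

    Parameters are encoded in log-space so the search is unconstrained and
    roughly comparable in scale across dimensions.
    """
    C = float(np.exp(log_params[0]))
    gammas = np.exp(log_params[1:])            # one kernel width per feature

    def kernel(A, B):
        As, Bs = A * np.sqrt(gammas), B * np.sqrt(gammas)
        return np.exp(-euclidean_distances(As, Bs, squared=True))

    score = cross_val_score(SVC(C=C, kernel=kernel), X, y, cv=3).mean()
    return -score                              # CMA-ES minimizes

x0 = np.zeros(1 + n_features)                  # start at log C = 0, log gamma_i = 0
es = cma.CMAEvolutionStrategy(x0, 0.5, {'maxiter': 20, 'verbose': -9})
es.optimize(neg_cv_accuracy)
best_C = np.exp(es.result.xbest[0])
best_gammas = np.exp(es.result.xbest[1:])
```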
“…Several methods for model selection in SVM and SVM-like models were proposed, e.g. in [9,10,6,7,11,8]. The popular way is application of Bayesian learning framework and maximal evidence principle [3].…”
Section: Introduction (mentioning)
confidence: 99%