2006 Sixth International Conference on Hybrid Intelligent Systems (HIS'06) 2006
DOI: 10.1109/his.2006.264914
Multiclass SVM Model Selection Using Particle Swarm Optimization

Cited by 43 publications (36 citation statements)
References 15 publications
“…Still, some comparisons can be made. Souza et al. [6] report a classification accuracy of 0.9882 using k-fold validation on the training set for the Vowel dataset. KGP returns an average accuracy of 0.9981 on the training set.…”
Section: Experiments and Results
confidence: 99%
“…Phienthrakul and Kijsirikul [20] used ES to learn the weights in a weighted linear combination of Gaussian radial basis functions. Souza et al. [6] use Particle Swarm Optimization (PSO) to learn optimal parameters in a Gaussian kernel function for multi-class classification, while Huang and Wang use a GA for the same task [13,12]. Runarsson and Sigurdsson [21] use a parallel ES to learn optimal parameters in a Gaussian kernel.…”
Section: Introduction
confidence: 99%
“…It can be applied to other parametric classifiers (KNN, neural network, etc.) with other optimization methods [9]. Moreover, it can be easily extended through the introduction of other parameters (kernel type) or objectives (number of support vectors, decision time).…”
Section: Discussion
confidence: 99%
“…In order to automate this process and to avoid an exhaustive or a random exploration of parameters, different authors have deployed search and optimization techniques [3,4,6,7,8,13,14,15]. In this context, the search space consists of a set of possible configurations of parameters, and the objective function corresponds to a performance measure (e.g., precision estimated by cross-validation) obtained by the SVM on the problem.…”
Section: SVM Parameter Selection
confidence: 99%
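The excerpt above frames SVM model selection as optimization over a parameter search space with a cross-validation score as the objective, which is the setting in which the cited paper applies PSO. Below is a minimal, illustrative PSO sketch in pure Python; the quadratic objective is a stand-in for a cross-validated accuracy surface over hypothetical (log10 C, log10 gamma) coordinates, not the actual method or datasets from the paper, and in a real run each objective call would train and validate an SVM.

```python
import random

def pso(objective, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer maximizing `objective` over a box.

    `bounds` is a list of (low, high) pairs, one per parameter dimension.
    For SVM model selection, a dimension would encode e.g. log10(C) or
    log10(gamma), and `objective` would return cross-validated accuracy.
    """
    dim = len(bounds)
    rnd = random.Random(0)  # fixed seed for reproducibility
    # Initialize particle positions uniformly inside the bounds, zero velocity.
    pos = [[rnd.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # per-particle best position
    pbest_val = [objective(p) for p in pos]      # per-particle best score
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle and clamp it to the search box.
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: a smooth peak at (1.0, -2.0), mimicking a CV-accuracy
# surface over hypothetical (log10 C, log10 gamma) coordinates.
best, score = pso(lambda p: -((p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2),
                  bounds=[(-3.0, 3.0), (-5.0, 1.0)])
```

The swarm concentrates around the peak of the surrogate surface; swapping the lambda for a function that trains an SVM and returns its cross-validated score turns the same loop into the kind of kernel-parameter search the excerpt describes.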