Parameter determination of support vector machine and feature selection using simulated annealing approach
Year: 2008
DOI: 10.1016/j.asoc.2007.10.012

Cited by 299 publications (116 citation statements)
References: 34 publications

“…This approach, however, requires an expert whose knowledge can be incorporated into a system capable of making accurate forecasts. The most popular algorithms, in terms of efficiency, are Support Vector Machines (SVM) [6,7] and Neural Networks (NN) [8,9]. Note that in this paper we will only use SVM, since we have not been able to replicate the results from the literature involving NNs.…”
Section: Introduction (mentioning)
Confidence: 75%

“…The basic concept behind SVM is to map the original data sets into a higher-dimensional feature space and construct an optimal separating hyperplane there, i.e., the hyperplane that maximizes the margin to the nearest data points of each class (Lin et al 2008; Pan et al 2008; Qu and Zuo 2010). Detailed formulations of SVM have been reported extensively in numerous studies (Cristianini and Shawe-Taylor 2000; Raghavendra and Deka 2014; Vapnik 1998).…”
Section: Support Vector Machine (mentioning)
Confidence: 99%
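
A minimal Python sketch of the kernel-mapping and maximal-margin idea described above, assuming scikit-learn is available; the synthetic data and the values of C and gamma are illustrative and are not taken from the indexed paper or the cited works.

# Sketch only: scikit-learn, the synthetic data and the parameter values
# are assumptions for illustration, not details from the cited studies.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic two-class data standing in for the "original data sets".
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Feature scaling is customary before SVM training.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# The RBF kernel implicitly maps the inputs into a higher-dimensional
# feature space; the fitted model is the maximal-margin hyperplane there.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

The RBF kernel makes the mapping implicit: the separating hyperplane is computed in the induced feature space without ever forming that space explicitly.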
“…This problem is noisy and multimodal, which makes gradient descent algorithms fail. For these reasons, stochastic search algorithms have been preferred by other researchers: simulated annealing in [15,26], particle swarm optimization in [27,28], and comprehensive learning particle swarm optimization combined with a BFGS algorithm in [29]. The present paper is based on the stochastic search algorithm known as the cross-entropy method (CEM), whose efficiency has been observed in several works.…”
Section: Selection of the SVR Surrogate Model Parameters (mentioning)
Confidence: 99%
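
The indexed paper determines SVM parameters with simulated annealing; the sketch below illustrates that general approach for the RBF parameters C and gamma. It assumes scikit-learn, and the Gaussian neighborhood move, geometric cooling schedule, iteration budget and synthetic data are illustrative choices, not the exact procedure of the paper or of [15,26].

# Sketch only: the cooling schedule, neighborhood move and data are
# illustrative assumptions, not the procedure of the indexed paper.
import math
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

def score(log_C, log_gamma):
    # Cross-validated accuracy acts as the (noisy) objective to maximize.
    clf = SVC(kernel="rbf", C=10.0 ** log_C, gamma=10.0 ** log_gamma)
    return cross_val_score(clf, X, y, cv=5).mean()

rng = random.Random(0)
state = (0.0, -1.0)                 # start at C = 1, gamma = 0.1 (log10 scale)
best = current = score(*state)
best_state = state
T = 1.0                             # initial temperature
for _ in range(100):
    # Neighborhood move: small random step in log-parameter space.
    cand = (state[0] + rng.gauss(0, 0.5), state[1] + rng.gauss(0, 0.5))
    cand_score = score(*cand)
    # Always accept improvements; accept worse moves with Boltzmann probability.
    if cand_score > current or rng.random() < math.exp((cand_score - current) / T):
        state, current = cand, cand_score
        if current > best:
            best, best_state = current, state
    T *= 0.95                       # geometric cooling
print("best CV accuracy %.3f at C=%.3g, gamma=%.3g"
      % (best, 10.0 ** best_state[0], 10.0 ** best_state[1]))

Searching in log10 space keeps the multiplicative scales of C and gamma roughly uniform, and the temperature-controlled acceptance of worse moves is what lets the search escape local optima of the noisy cross-validation objective.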