Computational and Ambient Intelligence
DOI: 10.1007/978-3-540-73007-1_35
Tuning L1-SVM Hyperparameters with Modified Radius Margin Bounds and Simulated Annealing

Abstract: In the design of support vector machines an important step is to select the optimal hyperparameters. One of the most widely used performance estimators is the Radius-Margin bound. Modifications of this bound have been made to adapt it to soft-margin problems, yielding a convex optimization problem for the L2 soft-margin formulation. However, it is still interesting to consider the L1 case due to the reduction in the number of support vectors. There have been some proposals to adapt the Radius-Margin bound…
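
For orientation, the radius-margin criterion the abstract refers to is typically T = R² · ‖w‖², where R is the radius of the smallest ball enclosing the data in feature space and ‖w‖ comes from the trained SVM; the hyperparameters are chosen to minimize T. The sketch below is a minimal illustration, not the authors' modified bound: it assumes a binary RBF-kernel L1-SVM via scikit-learn, approximates R² by the maximum squared distance to the feature-space centroid (a cheap stand-in for the exact enclosing-ball radius, which needs a QP), and the function name is illustrative.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import rbf_kernel

    def radius_margin_criterion(X, y, C, gamma):
        """Approximate R^2 * ||w||^2 for a binary RBF L1-SVM (illustrative)."""
        K = rbf_kernel(X, X, gamma=gamma)  # Gram matrix K_ij = k(x_i, x_j)
        # Squared distance of each point to the feature-space centroid:
        # ||phi(x_i) - c||^2 = K_ii - (2/n) sum_j K_ij + (1/n^2) sum_jk K_jk.
        # Its maximum is used here as a rough proxy for the ball radius R^2.
        dist2 = np.diag(K) - 2.0 * K.mean(axis=1) + K.mean()
        R2 = dist2.max()
        # ||w||^2 = sum_ij (y_i a_i)(y_j a_j) K_ij; for binary problems
        # SVC.dual_coef_ holds y_i * alpha_i over the support vectors.
        svm = SVC(C=C, kernel="rbf", gamma=gamma).fit(X, y)
        a = svm.dual_coef_.ravel()
        K_sv = K[np.ix_(svm.support_, svm.support_)]
        w2 = a @ K_sv @ a
        return R2 * w2

Minimizing this quantity over (C, gamma) — for instance with the simulated-annealing loop sketched further below — plays the role of the model-selection objective.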

Cited by 3 publications (2 citation statements)
References 8 publications
“…The Bessel kernel requires the optimization of 3 hyper-parameters, so grid search would have to be performed in a 3-dimensional space, which makes it cumbersome. To avoid a full grid search, a Simulated Annealing (SA) approach (Kirkpatrick et al., 1983; Acevedo et al., 2007) can be applied to parameter optimization. The SA approach, although time-consuming, is faster than a complete grid search, thus reducing the time required to fit the model.…”
Section: Choosing Hyper-parameters
confidence: 99%
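
A minimal sketch of the simulated-annealing idea the quote describes, assuming an RBF SVM with two hyperparameters (C, gamma) searched on a log scale and cross-validated error as the objective; the proposal step, cooling schedule, and function names are illustrative assumptions, not the cited papers' exact setup.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def cv_error(X, y, log_params):
        """5-fold cross-validated error for an RBF SVM (hyperparameters in log scale)."""
        C, gamma = np.exp(log_params)
        clf = SVC(C=C, kernel="rbf", gamma=gamma)
        return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

    def anneal(X, y, n_iter=200, T0=1.0, cooling=0.97, step=0.5, seed=0):
        rng = np.random.default_rng(seed)
        cur = np.zeros(2)                       # start at C = gamma = 1
        cur_err = cv_error(X, y, cur)
        best, best_err, T = cur.copy(), cur_err, T0
        for _ in range(n_iter):
            cand = cur + rng.normal(scale=step, size=cur.shape)  # random walk proposal
            err = cv_error(X, y, cand)
            # Accept downhill moves always; uphill moves with Boltzmann probability.
            if err < cur_err or rng.random() < np.exp(-(err - cur_err) / T):
                cur, cur_err = cand, err
                if err < best_err:
                    best, best_err = cand.copy(), err
            T *= cooling                        # geometric cooling schedule
        return np.exp(best), best_err           # back to (C, gamma) scale

Each iteration costs one cross-validated fit, so 200 iterations is typically far cheaper than exhaustively covering a 3-dimensional grid at comparable resolution, which is the trade-off the quoted passage points to.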
“…While some techniques are algorithm-specific, such as [14] and [15], some global optimization approaches can be applied to any algorithm, including collaborative filtering algorithms. Two widely used global optimization approaches are grid search and random search [8].…”
Section: Automatic Hyperparameter Optimization
confidence: 99%
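
The two approaches named in this quote can be contrasted in a few lines; the sketch below assumes an RBF SVM and uses scikit-learn's built-in searchers, with parameter ranges chosen purely for illustration. Grid search enumerates a fixed lattice, while random search samples the same space under an equal evaluation budget.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
    from scipy.stats import loguniform

    # Grid search: exhaustively evaluates every point on a 4 x 4 lattice.
    grid = GridSearchCV(
        SVC(kernel="rbf"),
        param_grid={"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]},
        cv=5,
    )

    # Random search: samples the same ranges log-uniformly, same 16-fit budget.
    rand = RandomizedSearchCV(
        SVC(kernel="rbf"),
        param_distributions={"C": loguniform(1e-1, 1e2),
                             "gamma": loguniform(1e-3, 1e0)},
        n_iter=16,
        cv=5,
    )

    # grid.fit(X, y); rand.fit(X, y)   # X, y: your dataset

Because both are black-box searches over the hyperparameter space, neither depends on the learning algorithm's internals, which is why the quoted survey treats them as applicable to any algorithm.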