2010
DOI: 10.1142/s0218126610005937
Noncost Sensitive SVM Training Using Multiple Model Selection

Abstract: In this paper, we propose a multi-objective optimization framework for SVM hyperparameter tuning. The key idea is to manage a population of classifiers optimizing both the False Positive (FP) and True Positive (TP) rates, rather than a single classifier optimizing a scalar criterion. Hence, each classifier in the population optimizes a particular trade-off between the objectives. Within the context of two-class classification problems, our work introduces the "ROC front" concept, depicting a population of…
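The population-based idea in the abstract can be illustrated with a minimal sketch: train one SVM per hyperparameter setting, score each on (FP, TP), and keep only the non-dominated trade-offs as the "ROC front". This is an assumption-laden sketch, not the paper's method — it uses scikit-learn's `SVC`, a synthetic dataset, and a plain grid in place of the paper's multi-objective search.

```python
# Hedged sketch of a "ROC front" over a population of SVM classifiers.
# Assumptions (not from the paper): scikit-learn's SVC, a synthetic
# dataset, and a simple (C, gamma) grid instead of an evolutionary search.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fp_tp_rates(clf):
    """False-positive and true-positive rates on the held-out split."""
    pred = clf.predict(X_te)
    fp = np.mean(pred[y_te == 0] == 1)  # negatives wrongly flagged positive
    tp = np.mean(pred[y_te == 1] == 1)  # positives correctly found
    return fp, tp

# Population of classifiers: one SVM per hyperparameter setting.
population = []
for C in [0.1, 1, 10, 100]:
    for gamma in [0.01, 0.1, 1]:
        clf = SVC(C=C, gamma=gamma).fit(X_tr, y_tr)
        population.append(((C, gamma), fp_tp_rates(clf)))

# "ROC front": keep the non-dominated (FP, TP) trade-offs, i.e. settings
# for which no other classifier has both lower FP and higher TP.
front = [p for p in population
         if not any(q[1][0] <= p[1][0] and q[1][1] >= p[1][1] and q[1] != p[1]
                    for q in population)]

for (C, gamma), (fp, tp) in sorted(front, key=lambda p: p[1][0]):
    print(f"C={C:<6} gamma={gamma:<5} FP={fp:.2f} TP={tp:.2f}")
```

Each surviving setting represents one trade-off a user could pick depending on the relative cost of false positives, which is the "noncost sensitive" point of the title: costs need not be fixed before training.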

Cited by 3 publications (1 citation statement)
References 13 publications
“…The technique in [35] uses meta-learning and case-based reasoning to propose good starting points for evolutionary parameter optimization of SVMs. The study in [36] presented a multi-objective optimization framework for tuning SVM hyper-parameters. More recent research [37] proposed a method for automatically selecting the kernel type (linear/non-linear) and tuning its parameters.…”
Section: Discussion On Related Work
mentioning confidence: 99%