2007 International Conference on Machine Learning and Cybernetics
DOI: 10.1109/icmlc.2007.4370641
Improved SVM for Learning Multi-Class Domains with ROC Evaluation

Abstract: The area under the ROC curve (AUC) has been used as a criterion to measure the performance of classification algorithms even when the training data embraces an unbalanced class distribution and cost-sensitiveness. The Support Vector Machine (SVM) is accepted to be a good algorithm in classification learning. This paper describes an improved SVM learning method, where RBF is used as its kernel function and the parameters of the RBF are optimized by a genetic algorithm. Within the parameter optimization and SVM le…
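The abstract's pipeline — an RBF-kernel SVM whose kernel parameter is tuned by a genetic algorithm — can be sketched in miniature. This is a hypothetical illustration, not the paper's implementation: the RBF kernel formula is standard, but the toy fitness function below stands in for the cross-validated AUC the paper would evaluate, and the GA is a deliberately minimal mutate-and-select loop.

```python
import math
import random

def rbf_kernel(x, y, gamma):
    """RBF (Gaussian) kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def evolve_gamma(fitness, pop_size=8, generations=20, seed=0):
    """Tiny genetic algorithm over a single RBF parameter (gamma).

    `fitness` maps a candidate gamma to a score to maximize; in the
    paper's setting this would be validation accuracy or AUC of the
    SVM trained with that gamma.
    """
    rng = random.Random(seed)
    pop = [rng.uniform(0.01, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)     # select the fittest
        survivors = pop[: pop_size // 2]
        # Mutate survivors multiplicatively to produce offspring.
        children = [max(0.001, g * rng.gauss(1.0, 0.2)) for g in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy unimodal fitness peaking at gamma = 2.0 stands in for validation AUC.
best = evolve_gamma(lambda g: -(g - 2.0) ** 2)
```

In practice the GA would also evolve the SVM's penalty parameter C alongside gamma, evaluating each chromosome by training the SVM and scoring it on held-out data.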

Cited by 7 publications (5 citation statements)
References 12 publications
“…Model performance was evaluated using widely applied statistics, namely the area under the receiver-operator characteristics (ROC) curve or AUC statistic. The area under the ROC curve (AUC) has been used as a criterion to measure the performance of the classification algorithms even if the training data embraces an unbalanced class distribution and cost-sensitiveness [30]. In each class, the ROC curve applies the threshold values to the output values so that for each threshold, the true-positive ratio (TPR) and the false-positive ratio (FPR) values are simplified.…”
Section: Methods (mentioning)
confidence: 99%
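The threshold sweep described in the excerpt — for each threshold over the output values, compute the true-positive ratio (TPR) and false-positive ratio (FPR), then take the area under the resulting curve — can be written directly. This is a generic sketch of the standard procedure, not code from the cited work:

```python
def roc_points(scores, labels):
    """Sweep a threshold over classifier scores; at each threshold,
    compute the false-positive ratio and true-positive ratio.
    `labels` are 1 (positive) or 0 (negative)."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return [(0.0, 0.0)] + points  # curve starts at the origin

def auc(points):
    """Area under the ROC curve by the trapezoid rule."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# A perfect ranking: both positives score above both negatives.
pts = roc_points([0.9, 0.8, 0.4, 0.2], [1, 1, 0, 0])
```

Because the AUC depends only on the ranking of scores, it is insensitive to the class distribution — which is why the excerpt recommends it for unbalanced and cost-sensitive data.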
“…For constructing the validation set, we randomly select one sample of each subject from the remaining samples of all the trained subjects. We use the one-versus-all strategy to transform the multi-class classification problem into several two-class classification problems [28]. We repeat the above steps, adding occlusion to the validation set as described in Sections 7.2 and 7.3, respectively.…”
Section: Cs Cs (mentioning)
confidence: 99%
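The one-versus-all transformation the excerpt refers to is mechanical: each class yields one binary problem in which that class's samples are positive and all others negative. A minimal sketch (generic, not the cited paper's code):

```python
def one_versus_all(labels):
    """Transform a multi-class label list into one binary problem per
    class: samples of the target class become 1, all others become 0."""
    classes = sorted(set(labels))
    return {c: [1 if y == c else 0 for y in labels] for c in classes}

problems = one_versus_all(["cat", "dog", "bird", "cat"])
```

A separate binary classifier (or ROC curve) is then evaluated on each of the resulting label vectors.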
“…We repeat the above steps, adding occlusion to the validation set as described in Sections 7.2 and 7.3, respectively. Finally, we perform constrained optimization using the GlobalSearch framework provided by MATLAB to maximize the average area under the receiver operating characteristic (ROC) curve of the transformed two-class classification problems [28].…”
Section: Cs Cs (mentioning)
confidence: 99%
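The objective the excerpt maximizes — the average AUC over the one-versus-all binary problems — is easy to state in code. The sketch below is an illustration in Python, not the cited MATLAB/GlobalSearch setup; it uses the Wilcoxon–Mann–Whitney form of the AUC (probability that a positive outranks a negative), which equals the ROC area:

```python
def binary_auc(scores, labels):
    """AUC as the probability that a positive sample's score exceeds a
    negative sample's (ties count half) — the Mann-Whitney statistic."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def average_auc(score_columns, labels, classes):
    """Macro-average the per-class AUC over the one-versus-all problems;
    this average is the quantity being maximized."""
    total = 0.0
    for c, col in zip(classes, score_columns):
        binarized = [1 if y == c else 0 for y in labels]
        total += binary_auc(col, binarized)
    return total / len(classes)

labels = [0, 0, 1, 1]
cols = [[0.9, 0.8, 0.2, 0.1],   # scores for "is class 0"
        [0.1, 0.2, 0.8, 0.9]]   # scores for "is class 1"
macro = average_auc(cols, labels, classes=[0, 1])
```

An outer optimizer (GlobalSearch in the cited work) would then adjust the model's free parameters to maximize `average_auc` on the validation set.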
“…However, extending the AUC to the multiclass problem is still an open research topic. Directly calculating the Volume under the ROC (VUC) surface [19] is complicated, while many studies [20,21] have tried to estimate the volume by projecting it into multiple 2-class dimensions.…”
Section: Volume Under ROC (Multiclass AUC) (mentioning)
confidence: 99%
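One common projection of the kind the excerpt mentions is the pairwise average of Hand and Till's M measure: instead of computing the full ROC volume, average the two-class AUC over every unordered pair of classes, restricting each comparison to samples of those two classes. A generic sketch under that assumption (not the specific estimator of references [20,21]):

```python
from itertools import combinations

def pair_auc(scores, labels, pos_class, neg_class):
    """AUC of `scores` restricted to two classes: the chance that a
    pos_class sample outranks a neg_class sample (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == pos_class]
    neg = [s for s, y in zip(scores, labels) if y == neg_class]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def multiclass_auc(score_cols, labels, classes):
    """Hand-and-Till-style M measure: average (A(i|j) + A(j|i)) / 2 over
    all unordered class pairs, using each class's own score column."""
    pairs = list(combinations(range(len(classes)), 2))
    total = 0.0
    for i, j in pairs:
        a_ij = pair_auc(score_cols[i], labels, classes[i], classes[j])
        a_ji = pair_auc(score_cols[j], labels, classes[j], classes[i])
        total += (a_ij + a_ji) / 2.0
    return total / len(pairs)
```

This reduces the k-class volume to k(k-1)/2 ordinary two-class AUCs, which is the "projection into multiple 2-class dimensions" idea in one concrete form.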