2012 IEEE International Conference on Granular Computing
DOI: 10.1109/grc.2012.6468599
Multiclass SVM with ramp loss for imbalanced data classification

Cited by 11 publications (5 citation statements)
References 15 publications
“…A classifier trained on an imbalanced data set tends to be biased towards the categories with more data and cannot achieve the best results. The higher the degree of data imbalance, the lower the classification accuracy of the trained classifier will be [49]. However, the problems of utilizing oversampling techniques include generating unreal data and over-fitting.…”
Section: SMOTE-ENN Method
Mentioning, confidence: 99%
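For context on the resampling step this citing work refers to, the following is a minimal, illustrative sketch (not code from either paper) that assumes the imbalanced-learn package: SMOTE synthesises minority-class samples and ENN then prunes noisy or overlapping ones, which is one way to mitigate the "unreal data" and over-fitting issues mentioned in the quote.

```python
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.combine import SMOTEENN

# a 9:1 imbalanced binary toy problem (illustrative data, not from the paper)
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
print("class counts before resampling:", Counter(y))

# SMOTE oversamples the minority class, then ENN removes noisy/overlapping samples
X_res, y_res = SMOTEENN(random_state=0).fit_resample(X, y)
print("class counts after resampling :", Counter(y_res))
```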
“…by assigning a higher cost to the minority classes, which is similar to modifying the loss function [22]). In this section, we propose to apply this idea to SVORIM, in order to compare the results obtained by the proposed over-sampling methods with a cost-sensitive approach.…”
Section: Cost-Sensitive SVORIM (CS-SVORIM)
Mentioning, confidence: 98%
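The cost-sensitive idea quoted above (penalising errors on minority classes more heavily) can be illustrated, under the assumption of a plain binary SVM rather than SVORIM, with scikit-learn's per-class weights. This is an analogy sketch of class-dependent misclassification costs, not the CS-SVORIM implementation; the 9:1 weight ratio is an arbitrary choice for the example.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import recall_score

# illustrative imbalanced data set
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# plain SVM vs. one that charges 9x more for errors on the minority class (label 1)
plain = SVC(kernel="rbf").fit(X_tr, y_tr)
costly = SVC(kernel="rbf", class_weight={0: 1.0, 1: 9.0}).fit(X_tr, y_tr)

print("minority recall, plain SVM         :", recall_score(y_te, plain.predict(X_te)))
print("minority recall, cost-sensitive SVM:", recall_score(y_te, costly.predict(X_te)))
```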
“…These functions, while yielding accurate results, are still computationally expensive when solved as an optimization problem due to their non-convex nature. Alternatively, the truncated hinge loss, or ramp loss, which is a non-smooth but continuous function, has been proved to be accurate and efficient for the SVM problem [26][27][28]. Due to its efficiency and simplicity, we decide to use the ramp loss to approximate the 0-1 loss function in Eq.…”
Section: Multiclass Classification with Ramp Loss
Mentioning, confidence: 99%
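As a concrete reference for the quoted passage, here is a small NumPy sketch of the ramp (truncated hinge) loss written as a difference of two hinge losses, R_s(z) = H_1(z) - H_s(z) with H_t(z) = max(0, t - z) and truncation level s < 1. The parameter name s and the value used below are illustrative, not taken from the paper.

```python
import numpy as np

def hinge(z, t=1.0):
    # H_t(z) = max(0, t - z), where z = y * f(x) is the signed margin
    return np.maximum(0.0, t - z)

def ramp(z, s=0.0):
    # R_s(z) = H_1(z) - H_s(z) = min(1 - s, max(0, 1 - z)), with s < 1.
    # Identical to the hinge loss for z >= s, but capped at 1 - s below it,
    # so a single badly misclassified outlier cannot dominate the objective.
    return hinge(z, 1.0) - hinge(z, s)

z = np.linspace(-3.0, 3.0, 7)          # margins from badly wrong to well classified
print("margin:", z)
print("hinge :", hinge(z))
print("ramp  :", ramp(z, s=0.0))       # values are capped at 1 - s = 1
```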