2013
DOI: 10.5351/csam.2013.20.6.481
Weighted Support Vector Machines with the SCAD Penalty

Abstract: Classification is an important research area, as data are now easily obtained even when the number of predictors becomes huge. The support vector machine (SVM) is widely used to classify a subject into a predetermined group because it has a sound theoretical background and performs better than other methods in many applications. The SVM can be viewed as a penalized method combining the hinge loss function with a penalty function. Instead of the L2 penalty function, Fan and Li (2001) proposed the smoothly clipped absolute d…
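The SCAD penalty mentioned in the abstract has a standard closed form. The sketch below is not taken from the paper; it follows the usual definition in Fan and Li (2001), with the conventional shape parameter a = 3.7:

```python
import numpy as np

def scad_penalty(beta, lam=1.0, a=3.7):
    """SCAD penalty of Fan and Li (2001), applied elementwise.

    lam is the regularization level; a (> 2) is the shape parameter,
    with 3.7 the value suggested by Fan and Li.
    """
    b = np.abs(beta)
    # Three regimes: linear near zero (like the L1 penalty),
    # a quadratic taper, then a constant (no shrinkage of large coefficients).
    linear = lam * b
    quad = (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1))
    const = (a + 1) * lam**2 / 2
    return np.where(b <= lam, linear,
                    np.where(b <= a * lam, quad, const))
```

Unlike the L2 penalty, SCAD is flat beyond a*lam, so large coefficients are not shrunk; this is what gives the penalized SVM its variable-selection behavior.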

Cited by 5 publications (5 citation statements); references 16 publications.
“…where V is the weight for the i-th observation in the k-th class. A weighted SVM has been proposed to make the SVM robust, i.e., insensitive to outliers or leverage points (see [13]). We consider the weight for each class as…”
Section: SVM for Unbalanced Cases
confidence: 99%
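The weighted SVM described in this statement replaces the plain hinge loss with a per-observation weighted sum. A minimal sketch of that objective term (the weight V here stands in for the quoted per-observation weight; the names and linear decision function f(x) = x·w + b are illustrative, not the paper's notation):

```python
import numpy as np

def weighted_hinge_loss(w_coef, b, X, y, V):
    """Weighted hinge loss: sum_i V_i * max(0, 1 - y_i * f(x_i)),
    with the linear decision function f(x) = x @ w_coef + b.

    Down-weighting suspected outliers via V_i is what makes the
    weighted SVM less sensitive to outliers and leverage points.
    """
    margins = y * (X @ w_coef + b)
    return float(np.sum(V * np.maximum(0.0, 1.0 - margins)))
```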
“…The within-group errors can be calculated as the misclassification rate for the k-th class. Weight (13) places much more weight on the minority class, while the well-classified group receives less weight. Larger values of |f(x)| in (13) represent well-classified observations.…”
Section: SVM for Unbalanced Cases
confidence: 99%
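As an illustration of up-weighting the minority class in an unbalanced problem, here is a common inverse-frequency weighting scheme. This is a generic sketch, not the paper's weight (13), which additionally discounts well-classified observations via |f(x)|:

```python
import numpy as np

def class_weights(y):
    """Inverse-frequency class weights: each class k gets
    n / (n_classes * n_k), so the minority class receives the
    larger weight.  (A stand-in for the paper's weight (13).)"""
    classes, counts = np.unique(y, return_counts=True)
    w = len(y) / (len(classes) * counts)
    return dict(zip(classes.tolist(), w.tolist()))

y = np.array([1, 1, 1, 1, -1])   # unbalanced labels
w = class_weights(y)             # minority class -1 gets the larger weight
```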
“…Some work has been done to extend the smoothly clipped absolute deviation (SCAD) penalty to weighted linear SVMs with special forms of such weights (see Jung (2013)), but, beyond that, there has not been any targeted investigation of this topic to our knowledge. These two reasons make these explorations vitally important.…”
Section: Introduction
confidence: 99%
“…However, it is worth noting that our setting differs from a simple classification format in two vital aspects: (a) although the treatment selection objective can be rewritten as a (weighted) classification problem (as shown in Section 2), it is in essence a fundamentally different problem from classification, and feature selection techniques in SVMs have not been studied in this context; and (b) the weighted SVM is a more complicated optimization problem than the standard SVM, in that the constraint on each support vector varies according to the weight associated with it, and research into feature extraction in this setting has also been fairly limited until now. Some work has been done to extend the SCAD penalty to weighted linear support vector machines with special forms of such weights (see Jung, 2013), but beyond that, there has not been any targeted investigation of this topic to our knowledge. These two reasons make these explorations vitally important.…”
Section: Introduction
confidence: 99%
“…One alternative is the least absolute deviation (LAD) estimate. Jung (2011, 2013) proposed robust estimators and outlier detection methods for regression models and the support vector machine. There are several robust versions of the LASSO.…”
Section: Introduction
confidence: 99%