2017
DOI: 10.1111/rssb.12244

Another Look at Distance-Weighted Discrimination

Abstract: Distance-weighted discrimination (DWD) is a modern margin-based classifier with an interesting geometric motivation. It was proposed as a competitor to the support vector machine (SVM). Despite many recent references on DWD, DWD is far less popular than the SVM, mainly for computational and theoretical reasons. We greatly advance the current DWD methodology and its learning theory. We propose a novel thrifty algorithm for solving standard DWD and generalized DWD, and our algorithm can be several…

Cited by 41 publications (37 citation statements). References 40 publications.
“…In this section, we develop an sGS-ADMM [11,33] to solve the dual of the CTNN model in (3.2). Extensive numerical experiments in many fields demonstrate that the sGS-ADMM is not only convergent but also faster than the possibly nonconvergent directly extended multiblock ADMM and its many other variants, e.g., see [3,11,12,30,33,59] and references therein.…”
mentioning
confidence: 99%
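The sGS-ADMM of [11,33] is specialized multiblock machinery; for orientation, the alternating updates it builds on are those of classical two-block ADMM. A minimal sketch in Python, using a lasso instance min ½‖Ax − b‖² + λ‖z‖₁ subject to x = z (an illustrative problem, not the CTNN dual of the cited papers; all names here are made up for the example):

```python
# Two-block ADMM for the lasso: min 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z.
# Illustrative only; sGS-ADMM handles multiblock problems via a symmetric
# Gauss-Seidel sweep, which is not shown here.
import numpy as np

def admm_lasso(A, b, lam=1.0, rho=1.0, iters=200):
    n = A.shape[1]
    x = z = u = np.zeros(n)
    AtA_rho = A.T @ A + rho * np.eye(n)   # reused by every x-update
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: a ridge-type linear solve
        x = np.linalg.solve(AtA_rho, Atb + rho * (z - u))
        # z-update: soft-thresholding (prox of the l1 norm)
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        # scaled dual update on the constraint x = z
        u = u + x - z
    return z

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10))
beta = np.zeros(10); beta[:3] = [2.0, -1.0, 0.5]
b = A @ beta + 0.1 * rng.normal(size=50)
print(np.round(admm_lasso(A, b), 2))  # recovers a sparse coefficient vector
```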
“…For comparison, we also supply the estimated β_0, β and ∇ from the CLIPS procedure to a QDA classifier, which is applied to all the observations in a testing set, followed by a majority-voting scheme (labelled QDA-MV). Lastly, we calculate the sample mean and variance of each variable in an observation set to form a new feature vector, as done in Miedema et al. (2012); then the support vector machine (SVM; Cortes and Vapnik, 1995) and distance-weighted discrimination (DWD; Marron et al., 2007; Wang and Zou, 2018) are applied to the features to make predictions (labelled SVM and DWD, respectively). We use the R library clime to calculate the CLIME estimates, the R library e1071 to calculate the SVM classifier, and the R library sdwd (Wang and Zou, 2016) to calculate the DWD classifier.…”
Section: Numerical Studies
mentioning
confidence: 99%
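The summary-feature workflow in the statement above (per-variable means and variances pooled into one feature vector, then a margin classifier) is straightforward to sketch. A minimal Python illustration on synthetic data, with scikit-learn's linear SVC standing in for the R packages named above (scikit-learn has no direct analogue of sdwd's DWD fit, so only the SVM branch is shown):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def summary_features(obs_set):
    """Per-variable sample mean and variance of one observation set,
    concatenated into a single feature vector (as in Miedema et al., 2012)."""
    return np.concatenate([obs_set.mean(axis=0), obs_set.var(axis=0, ddof=1)])

# 40 observation sets per class, each a (30 x 5) array of raw measurements
sets_pos = [rng.normal(0.5, 1.0, size=(30, 5)) for _ in range(40)]
sets_neg = [rng.normal(0.0, 1.2, size=(30, 5)) for _ in range(40)]

X = np.vstack([summary_features(s) for s in sets_pos + sets_neg])
y = np.array([1] * 40 + [-1] * 40)

clf = SVC(kernel="linear").fit(X, y)  # linear SVM on the summary features
print("training accuracy:", clf.score(X, y))
```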
“…The slack variable η_i is introduced to ensure that the corresponding margin d_i is non-negative, and the constant c > 0 is a tuning parameter that controls the overlap between classes. Problem (1) can also be written in a loss-plus-penalty form (e.g., [12]) as…”
Section: Introduction
mentioning
confidence: 99%
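The loss-plus-penalty display referred to in that statement is not reproduced on this page. As a point of reference, a sketch of the form given by Wang and Zou (2018), in that paper's notation (the citing paper's own display may differ):

```latex
\min_{\beta_0,\,\boldsymbol{\beta}}\;
  \frac{1}{N}\sum_{i=1}^{N}
  V_q\!\bigl(y_i(\beta_0 + \mathbf{x}_i^{\top}\boldsymbol{\beta})\bigr)
  + \lambda\,\lVert\boldsymbol{\beta}\rVert_2^2,
\qquad
V_q(u) =
\begin{cases}
  1 - u, & u \le \frac{q}{q+1},\\[4pt]
  \dfrac{1}{u^{q}}\cdot\dfrac{q^{q}}{(q+1)^{q+1}}, & u > \frac{q}{q+1},
\end{cases}
```

where λ > 0 is a tuning parameter playing the role of c in the constrained problem (1), and q = 1 recovers the standard DWD loss.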
“…In binary classification problems, linear SVMs seek a hyperplane maximizing the smallest margin over all data points, while DWD seeks a hyperplane minimizing the sum of inverse margins over all data points. Reference [8] suggests replacing the inverse margins with the q-th power of the inverse margins in a generalized DWD method; see [12] for a detailed description. Formally, for a training data set of N observations {(x_i, y_i)}_{i=1}^N, where x_i ∈ ℝ^p and y_i ∈ {−1, +1}, binary generalized linear DWD seeks a proper separating hyperplane, with an intercept and a slope vector as parameters, through an optimization problem.…”
Section: Introduction
mentioning
confidence: 99%
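The q-th-power inverse-margin idea can be made concrete through the generalized DWD loss. A minimal sketch, assuming the loss form of Wang and Zou (2018); the function name and the test margins are illustrative:

```python
import numpy as np

def dwd_loss(u, q=1.0):
    """Generalized DWD loss V_q: linear in the margin u for small u, and
    proportional to the q-th power of the inverse margin for large u."""
    u = np.asarray(u, dtype=float)
    thresh = q / (q + 1.0)
    const = q**q / (q + 1.0) ** (q + 1.0)
    return np.where(u <= thresh, 1.0 - u, const / u**q)

margins = np.array([-0.5, 0.25, 0.5, 2.0])
print(dwd_loss(margins, q=1.0))  # q = 1: loss is 1/(4u) for u > 1/2, e.g. 0.125 at u = 2
```

The two branches meet at u = q/(q+1), so the loss is continuous and differentiable there, which is what makes the loss-plus-penalty formulation amenable to the fast algorithms discussed in the abstract.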