2015
DOI: 10.1002/wics.1345

Distance‐weighted discrimination

Abstract: High-dimension, low-sample-size (HDLSS) statistical analysis is becoming increasingly important in a wide range of applied contexts. In such settings, the popular Support Vector Machine suffers from "data piling" at the margin, which can diminish its generalizability. This motivates Distance-Weighted Discrimination, which is based on second-order cone programming, a modern computationally intensive optimization method.
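Where the SVM optimizes only the minimum margin (inviting data piling of points onto the margin), the DWD criterion involves all points, minimizing the sum of inverse distances to the separating hyperplane. The following is a minimal NumPy sketch of that criterion in the separable case; the toy data, initialization, and projected-gradient steps are illustrative assumptions, not the paper's method, which handles slack variables and solves the full problem via second-order cone programming.

```python
import numpy as np

# Toy separable data: two well-separated Gaussian classes (illustrative).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-5.0, 0.5, (20, 2)), rng.normal(5.0, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

# DWD criterion (separable case): minimize sum_i 1 / r_i over a unit-norm
# direction w and offset b, where r_i = y_i * (w . x_i + b) is the signed
# distance of x_i to the hyperplane {x : w . x + b = 0}.
w, b, lr = np.array([1.0, 0.0]), 0.0, 0.005
obj0 = np.sum(1.0 / (y * (X @ w + b)))   # objective at the starting point
for _ in range(500):
    r = y * (X @ w + b)                  # signed distances (all positive here)
    g = -y / r**2                        # d(1/r_i)/d(w . x_i + b)
    w -= lr * (X.T @ g)                  # gradient step in w
    b -= lr * np.sum(g)                  # gradient step in b
    w /= np.linalg.norm(w)               # project w back onto the unit sphere
obj = np.sum(1.0 / (y * (X @ w + b)))    # objective after optimization
```

Because every point's inverse distance contributes to the objective, no point can pile up exactly on a margin boundary; in the non-separable case the paper adds slack variables and casts the problem as a second-order cone program.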


Cited by 89 publications (207 citation statements)
References 18 publications
“…For the classifiers based on Markov models considered in Section 2, the computation of the maximum likelihood estimates of parameters is fairly straight-forward; classifiers based on RD are computationally far less attractive than those based on Markov models. Methods based on support vector machines (SVM) (see Vapnik (1998);Hastie, Tibshirani, and Friedman (2009)) and distance weighted discrimination (DWD) (see Marron, Todd, and Ahn (2007);Qiao et al (2010)) are two well-known classification techniques available in the literature that are well equipped to deal with high-dimensional data sets, and it is appropriate to consider linear classifiers based on T x (k) using SVM and DWD as alternatives to RD.…”
Section: Other Linear Classifiers (mentioning, confidence 99%)
“…In a recent extensive numerical study by Fernández-Delgado et al (2014), the kernel SVM was shown to be one of the best among 179 commonly used classifiers. Marron et al (2007) invented a new classification algorithm named distance-weighted discrimination (DWD), which retains the elegant geometric interpretation of the SVM, resolves a 'data piling' issue and reveals competitive performance. Since then much work has been devoted to the development of DWD.…”
Section: Introduction (mentioning, confidence 99%)
“…Since then much work has been devoted to the development of DWD. Readers are referred to Marron (2015) for an up-to-date list of work on DWD. However, DWD is still only known to a small group of researchers.…”
Section: Introduction (mentioning, confidence 99%)
“…In the literature, there are a large number of classifiers available, for example, linear discriminant analysis [1], support vector machines (SVMs) [2], Random Forests [3], distance-weighted discrimination (DWD) [4], and large margin unified machines (LUMs) [5]. The study by Hastie et al [6] provides a comprehensive review of many machine learning techniques.…”
Section: Introduction (mentioning, confidence 99%)