2012
DOI: 10.1093/biomet/ass029

Bidirectional discrimination with application to data visualization

Abstract: Linear classifiers are very popular, but can have limitations when classes have distinct subpopulations. General nonlinear kernel classifiers are very flexible, but do not give clear interpretations and may not be efficient in high dimensions. We propose the bidirectional discrimination classification method, which generalizes linear classifiers to two or more hyperplanes. This new family of classification methods gives much of the flexibility of a general nonlinear classifier while maintaining the inte…
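
As a rough illustration of the decision-rule family the abstract describes (a sketch only: the combination rule and the fitting of the hyperplanes are defined by the paper's optimization problem, not by this code), the single linear rule sign(w'x + b) can be generalized to K hyperplanes whose signed scores are merged, for example by their product:

```python
import numpy as np

# Minimal sketch, NOT the estimation procedure of Huang et al. (2012):
# a K-hyperplane generalization of the linear rule sign(w'x + b).
# Merging the K signed scores by their product is one natural choice;
# it reduces to the ordinary linear classifier when K = 1.

def multi_hyperplane_predict(X, W, b):
    """X: (n, d) data; W: (K, d) hyperplane normals; b: (K,) offsets."""
    scores = X @ W.T + b                      # (n, K) signed scores
    return np.sign(np.prod(scores, axis=1))   # +1 / -1 class labels
```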

Cited by 11 publications (13 citation statements)
References 15 publications
“…Interpretability is generally difficult with SVMs, except for the linear kernel SVM and its linear extensions (Huang et al., 2012). The restriction in interpretability and the need to transfer data for nearest neighbor approaches could easily lead to the decision that only logreg and RF are used for an application.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…However, the combinations of the two variables obtained with multidirectional classifiers provide a better result. This is somehow natural because one group has clusters and it has already been observed in Huang et al (2012) that products of two hyperplanes might produce good results in this situation. When the dimension of the classification problem increases, the multidirectional classifier (based on QSVM) still manages to find the best two directions to look at the data.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
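
The situation described in the statement above (one group split into clusters, handled by a product of two hyperplanes) can be reproduced on synthetic data. A hypothetical sketch, with the two hyperplanes placed by hand rather than fitted as in Huang et al. (2012):

```python
import numpy as np

rng = np.random.default_rng(0)

# Class -1 occupies a central band; class +1 splits into two clusters on
# either side of it. No single hyperplane separates the classes, but the
# sign of the product of two hyperplane scores does. The planes x1 = -1
# and x1 = +1 are chosen by hand, not fitted by the paper's method.
X_neg = rng.normal(0.0, 0.3, size=(100, 2))               # central cluster
X_pos = np.vstack([rng.normal(-2.0, 0.3, size=(50, 2)),   # left cluster
                   rng.normal(+2.0, 0.3, size=(50, 2))])  # right cluster

w = np.array([1.0, 0.0])                                  # shared normal

def product_rule(X):
    # planes at x1 = -1 and x1 = +1; sign of the product of the two scores
    return np.sign((X @ w + 1.0) * (X @ w - 1.0))

print((product_rule(X_neg) == -1).mean())   # ≈ 1.0
print((product_rule(X_pos) == +1).mean())   # ≈ 1.0
```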
“…The ideas in this paper are related to those in Huang et al (2012), where the authors look for two discriminative directions by solving an optimization problem subject to certain restrictions to deal with distinct subpopulations in the groups. Though the present approach is completely different, the methodology proposed in this paper can also work under the presence of different clusters within the groups (see the example in Section 4).…”
Section: Figure (citation type: mentioning; confidence: 99%)
“…Among various classification tools, linear classifiers are popular especially for high dimensional problems, due to their simplicity and good interpretability. While widely used, linear classifiers can be suboptimal for many practical problems [7]. One main practical problem we are interested in this paper is classification in the presence of latent subgroups.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
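
To make the latent-subgroup point concrete: in an XOR-style layout, where each class is a mixture of two subgroups in opposite quadrants, every single hyperplane is near chance, while a product of two hyperplanes separates the classes. A hypothetical sketch (hand-placed coordinate hyperplanes, not the paper's fitted directions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Each class is a mixture of two latent subgroups in opposite quadrants,
# so no single hyperplane can do much better than chance. The product of
# the two coordinate hyperplanes x1 = 0 and x2 = 0 classifies perfectly.
centers_pos = np.array([[ 2.0,  2.0], [-2.0, -2.0]])   # class +1 subgroups
centers_neg = np.array([[-2.0,  2.0], [ 2.0, -2.0]])   # class -1 subgroups
X_pos = np.repeat(centers_pos, 50, axis=0) + rng.normal(0, 0.3, (100, 2))
X_neg = np.repeat(centers_neg, 50, axis=0) + rng.normal(0, 0.3, (100, 2))

pred_pos = np.sign(X_pos[:, 0] * X_pos[:, 1])   # product of two hyperplanes
pred_neg = np.sign(X_neg[:, 0] * X_neg[:, 1])
print((pred_pos == +1).mean(), (pred_neg == -1).mean())   # ≈ 1.0, ≈ 1.0
```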