2008
DOI: 10.1214/07-ejs125

Structured variable selection in support vector machines

Abstract: When applying the support vector machine (SVM) to high-dimensional classification problems, we often impose a sparse structure in the SVM to eliminate the influences of the irrelevant predictors. The lasso and other variable selection techniques have been successfully used in the SVM to perform automatic variable selection. In some problems, there is a natural hierarchical structure among the variables. Thus, in order to have an interpretable SVM classifier, it is important to respect the heredity principle wh…
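As a rough illustration of the kind of sparsity the abstract describes — a plain L1-penalized linear SVM, not the authors' structured method — the following sketch runs proximal subgradient descent on synthetic data in which only the first two of ten predictors are relevant. All data, names, and parameter values here are illustrative assumptions:

```python
import numpy as np

# Synthetic data: only features 0 and 1 carry signal.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n))

w = np.zeros(p)
lam, lr = 0.2, 0.1          # L1 penalty weight and step size (illustrative)
for _ in range(500):
    margins = y * (X @ w)
    active = margins < 1     # points violating the margin
    # Subgradient of the average hinge loss max(0, 1 - y * x.w)
    grad = -(X[active].T @ y[active]) / n
    w -= lr * grad
    # Soft-thresholding: proximal step for the L1 penalty,
    # which drives coefficients of irrelevant predictors to exactly zero
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)

print(np.nonzero(w)[0])      # indices of the selected (nonzero) predictors
```

The L1 proximal step is what performs the "automatic variable selection" mentioned in the abstract: coefficients whose average gradient magnitude stays below the penalty weight are pinned at zero.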


Cited by 9 publications (7 citation statements)
References 29 publications
“…Structured Perceptron [6] and structured SVM [7] based approaches only require efficiently computing the arg max, unlike CRFs, which require computing many other marginals and the partition function. These approaches are non-probabilistic, and they utilize a simple form for the score:…”
Section: HMMs, MEMMs, CRFs and Structured Perceptron
confidence: 99%
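The arg-max decoding this excerpt refers to can be sketched for a linear-chain score s(y) = Σ_t emit[t, y_t] + Σ_t trans[y_{t-1}, y_t], where Viterbi dynamic programming finds the best label sequence; the `emit` and `trans` score tables below are illustrative, not taken from any cited paper:

```python
import numpy as np

def viterbi(emit, trans):
    """Arg-max label sequence under a linear-chain score.

    emit:  (T, K) per-position label scores
    trans: (K, K) transition scores between adjacent labels
    """
    T, K = emit.shape
    score = emit[0].copy()                # best score ending in each label
    back = np.zeros((T, K), dtype=int)    # backpointers
    for t in range(1, T):
        cand = score[:, None] + trans + emit[t]  # (prev, cur) candidates
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # Trace the best path backwards from the highest-scoring final label
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

emit = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
trans = np.array([[0.5, -1.0], [-1.0, 0.5]])
print(viterbi(emit, trans))  # → [0, 0, 0]
```

This single arg-max computation is all that structured perceptron and structured SVM training need per example, in contrast to the marginal and partition-function computations required by CRFs.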
“…Various variable selection methods have been studied in recent years to encourage sparsity in SVMs (Cortes & Vapnik, 1995). References include Zhu et al. (2003), Wu et al. (2008) and Park et al. (2012). Our work can be viewed as a penalized weighted SVM.…”
Section: Introduction
confidence: 99%
“…Exploring the relationship between features is not new. Recently, in structured feature selection, supervised learning algorithms have been explored for data sets whose features have some natural "structure" relationships [27,28,29,30,31]. For example, Yuan and Lin explored the situation where features may be naturally partitioned into groups, and studied the regression problem with grouped features using a technique called the grouped Lasso [28].…”
Section: Introduction
confidence: 99%
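The grouped Lasso mentioned in this excerpt selects or drops whole groups of features at once; its core ingredient is the group-wise soft-thresholding (proximal) operator, sketched below with an illustrative group encoding (a list of index arrays, which is an assumption of this sketch, not notation from the cited work):

```python
import numpy as np

def group_soft_threshold(w, groups, tau):
    """Proximal operator of the group-lasso penalty tau * sum_g ||w_g||_2.

    Shrinks each group's coefficient block toward zero and zeroes out the
    whole group when its norm falls below tau: all-in or all-out selection.
    """
    w = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm <= tau:
            w[g] = 0.0           # the entire group is dropped
        else:
            w[g] *= 1.0 - tau / norm  # the group survives, shrunk
    return w

w = np.array([3.0, 4.0, 0.1, 0.2])
groups = [np.array([0, 1]), np.array([2, 3])]
print(group_soft_threshold(w, groups, tau=1.0))  # → [2.4 3.2 0.  0. ]
```

The second group has norm ≈ 0.22 < tau, so both of its coefficients vanish together, which is exactly the group-level selection behavior the excerpt attributes to Yuan and Lin's method.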
“…Another possible type of structural relationship among features is a hierarchical relation (i.e., a directed acyclic graph defined on the features), which has been explored in [27,30]. In [29], both group structure and hierarchical relations were studied in a unified framework.…”
Section: Introduction
confidence: 99%
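The hierarchical relation discussed here connects back to the heredity principle in the abstract: under strong heredity, an interaction term x_i·x_j may enter the model only when both parent main effects are selected. A minimal sketch of that constraint, with a hypothetical pair-based encoding of interactions:

```python
def enforce_strong_heredity(selected_main, interactions):
    """Keep only interactions whose parent main effects are both selected.

    selected_main: set of selected main-effect indices
    interactions:  list of (i, j) index pairs (hypothetical encoding)
    """
    return [(i, j) for (i, j) in interactions
            if i in selected_main and j in selected_main]

# With main effects 0 and 2 selected, only the (0, 2) interaction is eligible
print(enforce_strong_heredity({0, 2}, [(0, 2), (0, 1), (1, 2)]))  # → [(0, 2)]
```

Weak heredity would instead require only one parent to be selected (`i in selected_main or j in selected_main`); structured penalties like those in the paper enforce such constraints through the penalty itself rather than by post-hoc filtering.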