2013
DOI: 10.1109/tkde.2011.247

Cutting Plane Training for Linear Support Vector Machines

Abstract: Support Vector Machines (SVMs) have been shown to achieve high performance on classification tasks across many domains, and a great deal of work has been dedicated to developing computationally efficient training algorithms for linear SVMs. One approach [1] approximately minimizes risk through use of cutting planes, and is improved by [2], [3]. We build upon this work, presenting a modification to the algorithm developed by Franc and Sonnenburg [2]. We demonstrate empirically that our changes can reduce cuttin…
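The abstract names the general technique but not its mechanics, so the following is a minimal sketch of how cutting-plane (CPA-style) training of a linear SVM typically works: each iteration adds a linear lower bound (a cutting plane) on the empirical hinge risk and re-solves a small reduced problem over the collected planes. The function names, toy data, stopping tolerance, and the use of SciPy's SLSQP solver for the reduced problem's dual are illustrative assumptions, not the paper's implementation or its specific modification of Franc and Sonnenburg's algorithm.

```python
# A minimal, illustrative cutting-plane (CPA-style) trainer for a linear SVM.
# Objective assumed: P(w) = (lam/2)*||w||^2 + (1/n)*sum_i max(0, 1 - y_i * <w, x_i>).
import numpy as np
from scipy.optimize import minimize


def hinge_risk_and_plane(w, X, y):
    """Return R(w) and a cutting plane (a, b) with R(v) >= <a, v> + b for all v."""
    margins = 1.0 - y * (X @ w)
    active = margins > 0
    risk = np.maximum(margins, 0.0).mean()
    a = -(y[active, None] * X[active]).sum(axis=0) / len(y)  # a subgradient of R at w
    b = risk - a @ w                                          # plane touches R at w
    return risk, a, b


def solve_reduced(A, c, lam):
    """Minimize lam/2*||w||^2 + max(0, max_t <a_t, w> + c_t) via its dual QP."""
    T = len(c)

    def dual(alpha):
        v = A.T @ alpha
        return (v @ v) / (2.0 * lam) - c @ alpha

    cons = [{"type": "ineq", "fun": lambda alpha: 1.0 - alpha.sum()}]  # sum(alpha) <= 1
    res = minimize(dual, np.full(T, 1.0 / T), method="SLSQP",
                   bounds=[(0.0, 1.0)] * T, constraints=cons)
    return -(A.T @ res.x) / lam                               # primal solution from the dual


def cutting_plane_svm(X, y, lam=0.1, tol=1e-3, max_iter=50):
    w = np.zeros(X.shape[1])
    planes_a, planes_c = [], []
    for _ in range(max_iter):
        risk, a, c = hinge_risk_and_plane(w, X, y)
        planes_a.append(a)
        planes_c.append(c)
        upper = lam / 2.0 * (w @ w) + risk                    # true objective at current w
        A, cvec = np.array(planes_a), np.array(planes_c)
        w = solve_reduced(A, cvec, lam)
        model_risk = max(0.0, float(np.max(A @ w + cvec)))    # piecewise-linear model of R
        lower = lam / 2.0 * (w @ w) + model_risk              # reduced-problem value: a lower bound
        if upper - lower <= tol:                              # stop when the gap is small enough
            break
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=200))
    w = cutting_plane_svm(X, y)
    print("training accuracy:", float((np.sign(X @ w) == y).mean()))
```

In practice the reduced problem is solved with a dedicated QP routine and the gap between the best objective value seen so far and the reduced-problem value drives termination; the sketch above uses a simplified version of that test.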

Cited by 9 publications (1 citation statement). References 11 publications.
“…Several researchers also explore how to train the primal form of (4) and the extended models fast. The existing algorithms can be broadly categorized into two categories: the cutting-plane methods [11,5,12,13,25] and subgradient methods [3,17]. For example, in [17], Shalev-Shwartz et al. described and analyzed a simple and effective stochastic sub-gradient descent algorithm and proved that the number of iterations required to obtain a solution of accuracy ε is O(1/ε).…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
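The quoted passage contrasts cutting-plane methods with the stochastic subgradient algorithm of Shalev-Shwartz et al. [17] (Pegasos). Below is a minimal sketch of that style of solver; the 1/(λt) step-size schedule and the optional projection onto a norm ball follow the commonly described variant and are assumptions here, not a reproduction of the cited paper's code.

```python
# A minimal, illustrative Pegasos-style stochastic subgradient solver for a linear SVM.
import numpy as np


def stochastic_subgradient_svm(X, y, lam=0.1, n_iter=20000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    radius = 1.0 / np.sqrt(lam)                   # the optimum lies in this ball
    for t in range(1, n_iter + 1):
        i = rng.integers(n)                       # draw one training example uniformly
        eta = 1.0 / (lam * t)                     # decreasing step size (assumed schedule)
        if y[i] * (X[i] @ w) < 1.0:               # hinge loss active: include its subgradient
            w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
        else:                                     # only the regularizer contributes
            w = (1.0 - eta * lam) * w
        norm = np.linalg.norm(w)                  # optional projection step
        if norm > radius:
            w *= radius / norm
    return w
```

Each update touches a single randomly drawn example, so the per-iteration cost is independent of the training-set size; combined with the iteration bound quoted above, this is what makes the approach attractive for large problems.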