2014
DOI: 10.1016/j.ins.2014.03.059

A novel Frank–Wolfe algorithm. Analysis and applications to large-scale SVM training

Cited by 28 publications (46 citation statements)
References 25 publications
“…On the theoretical side, FW methods come with iteration complexity bounds that are independent of the number of variables in the problem, and sparsity guarantees that hold during the whole execution of the algorithm [3,22]. In addition, several variants of the basic procedure have been analyzed, which can improve the convergence rate and practical performance of the basic FW iteration [15,35,26,6]. From a practical point of view, they have emerged as efficient alternatives to traditional methods in several contexts, such as large-scale SVM classification [7,8,35,6] and nuclear norm-regularized matrix recovery [22,42].…”
Section: Frank–Wolfe Optimization
confidence: 99%
“…The key observation is that, when Σ is a polytope (e.g. the unit simplex for L2-SVMs [35], the ℓ1-ball of radius δ for the Lasso problem (1), a spectrahedron in nuclear norm for matrix recovery [14]), the search in step 3 can be reduced to a search among the vertices of Σ. This makes it possible to devise cheap analytical formulas for finding u^(k), ensuring that each iteration has an overall cost of O(p).…”
Section: End For
confidence: 99%
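The excerpt above describes the heart of the Frank–Wolfe step: over a polytope Σ, the linear subproblem argmin over u in Σ of ⟨∇f(x), u⟩ is always attained at a vertex, and for the unit simplex or the ℓ1-ball the minimizing vertex has a closed form computable in O(p). A minimal Python sketch of these oracles and the classic FW iteration (function and variable names are illustrative, not the paper's own implementation):

```python
import numpy as np

def lmo_simplex(grad):
    """LMO over the unit simplex: the minimizing vertex is e_i
    with i = argmin_i grad_i (an O(p) scan)."""
    u = np.zeros_like(grad)
    u[np.argmin(grad)] = 1.0
    return u

def lmo_l1_ball(grad, delta):
    """LMO over the l1-ball of radius delta: the minimizing vertex is
    -delta * sign(grad_i) * e_i with i = argmax_i |grad_i|."""
    i = np.argmax(np.abs(grad))
    u = np.zeros_like(grad)
    u[i] = -delta * np.sign(grad[i])
    return u

def frank_wolfe(grad_f, x0, lmo, iters=1000):
    """Classic FW iteration with the standard step size 2/(k+2).
    Each iterate is a convex combination of vertices, so feasibility
    (and iterate sparsity) is maintained throughout."""
    x = x0.copy()
    for k in range(iters):
        u = lmo(grad_f(x))               # O(p) vertex search
        gamma = 2.0 / (k + 2.0)
        x = (1.0 - gamma) * x + gamma * u
    return x
```

For example, minimizing ||x - c||^2 over the simplex (with c itself feasible) drives the iterates toward c at the usual O(1/k) rate in objective value.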
“…These two FW variants, as well as others in the literature [44,45,46,35], converge faster than FW. However, since setting the step size is more challenging and keeping track of the active atoms further complicates the algorithm, in this thesis we focus on parallelizing the classic FW.…”
Section: FW Variants
confidence: 60%