2019
DOI: 10.48550/arxiv.1902.06332
Preprint

Quantized Frank-Wolfe: Faster Optimization, Lower Communication, and Projection Free

Mingrui Zhang,
Lin Chen,
Aryan Mokhtari
et al.

Abstract: How can we efficiently mitigate the overhead of gradient communications in distributed optimization? This problem is at the heart of training scalable machine learning models and has been mainly studied in the unconstrained setting. In this paper, we propose Quantized Frank-Wolfe (QFW), the first projection-free and communication-efficient algorithm for solving constrained optimization problems at scale. We consider both convex and non-convex objective functions, expressed as a finite-sum or, more generally, a stochastic optimization problem…
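The abstract describes QFW only at a high level. As a rough illustration (not the authors' implementation), the sketch below shows one simplified master/worker iteration, assuming a toy sign-magnitude quantizer, an ℓ1-ball constraint, and the classical 2/(t+2) Frank-Wolfe step size; the paper's variance-reduction component is omitted and all names here are hypothetical.

```python
import numpy as np

def toy_quantize(g, s=4):
    # Hypothetical stand-in for the paper's quantization scheme:
    # communicate the gradient's norm, coordinate signs, and
    # magnitudes rounded to s+1 levels.
    norm = np.linalg.norm(g)
    if norm == 0.0:
        return g
    levels = np.round(np.abs(g) / norm * s) / s
    return norm * np.sign(g) * levels

def l1_lmo(grad, radius=1.0):
    # Linear minimization oracle for the l1 ball:
    # argmin_{||v||_1 <= radius} <grad, v> is a signed, scaled
    # coordinate vector, so no projection is ever needed.
    i = int(np.argmax(np.abs(grad)))
    v = np.zeros_like(grad)
    v[i] = -radius * np.sign(grad[i])
    return v

def qfw_step(x, worker_grads, t, radius=1.0):
    # The server averages the quantized worker gradients, queries
    # the LMO, and moves toward the returned atom with step 2/(t+2).
    g_hat = np.mean([toy_quantize(g) for g in worker_grads], axis=0)
    v = l1_lmo(g_hat, radius)
    gamma = 2.0 / (t + 2)
    return (1.0 - gamma) * x + gamma * v
```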

Cited by 1 publication (2 citation statements)
References 10 publications
“…In the first experiment, we consider an OCO problem – online multiclass logistic regression (Zhang et al. 2019), with f_t as the multiclass logistic loss function…”
Section: Online Multiclass Logistic Regression
confidence: 99%
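The quoted experiment takes f_t to be the multiclass logistic loss. One standard form (the exact variant used is an assumption here) is the softmax cross-entropy:

```python
import numpy as np

def multiclass_logistic_loss(W, a, y):
    # W: (num_classes, d) weight matrix, a: feature vector in R^d,
    # y: integer class label. Softmax cross-entropy for one example.
    z = W @ a
    z = z - z.max()  # shift logits for numerical stability
    return np.log(np.exp(z).sum()) - z[y]
```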
“…In the second experiment, we focus on training a one-hidden-layer neural network with an additional ℓ1-norm constraint (Zhang et al. 2019). Specifically, given a multiclass data set {(a_i, y_i)}_{i=1}^n with (a_i, y_i) ∈ ℝ^d × {1, …”
Section: Training a One-Hidden-Layer Neural Network
confidence: 99%
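For the second quoted experiment, the model is a one-hidden-layer network trained under an ℓ1-norm constraint. A minimal sketch, assuming a ReLU hidden layer and a constraint ‖θ‖_1 ≤ r on the flattened parameter vector (the quote does not specify these details): with a projection-free method such as QFW, the constraint enters only through the ℓ1-ball LMO, never through a projection step.

```python
import numpy as np

def forward(theta, a, d, h, k):
    # Unpack a flat parameter vector theta into W1 (h x d) and
    # W2 (k x h), then compute class scores with a ReLU hidden layer.
    # Note the constraint ||theta||_1 <= r never appears here: a
    # projection-free method enforces it via the LMO alone.
    W1 = theta[: h * d].reshape(h, d)
    W2 = theta[h * d:].reshape(k, h)
    return W2 @ np.maximum(W1 @ a, 0.0)
```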