2021
DOI: 10.48550/arxiv.2107.14575
Preprint

DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning

Abstract: Gradient quantization is an emerging technique for reducing communication costs in distributed learning. Existing gradient quantization algorithms often rely on engineering heuristics or empirical observations, lacking a systematic approach to dynamically quantizing gradients. This paper addresses this issue by proposing a novel dynamically quantized SGD (DQ-SGD) framework, enabling the quantization scheme to be adjusted at each gradient descent step by exploring the trade-off between communication cost…
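
The abstract describes adjusting the gradient quantization level per step. As a rough illustration of that general idea, and not the authors' actual DQ-SGD rule (the abstract is truncated above), the following NumPy sketch quantizes each gradient with QSGD-style unbiased stochastic rounding and picks the bit-width via a hypothetical norm-based heuristic `choose_bits`:

```python
import numpy as np

def quantize(grad: np.ndarray, num_bits: int, rng: np.random.Generator) -> np.ndarray:
    """QSGD-style stochastic uniform quantization of a gradient vector.

    Each coordinate is mapped onto 2**num_bits - 1 levels of max|grad|,
    with stochastic rounding, so the quantized gradient is unbiased.
    """
    levels = 2 ** num_bits - 1
    scale = np.max(np.abs(grad))
    if scale == 0.0:
        return grad
    normalized = np.abs(grad) / scale * levels      # values in [0, levels]
    lower = np.floor(normalized)
    # Round up with probability equal to the fractional part (unbiasedness).
    rounded = lower + (rng.random(grad.shape) < normalized - lower)
    return np.sign(grad) * rounded / levels * scale

def choose_bits(grad: np.ndarray, min_bits: int = 2, max_bits: int = 8) -> int:
    """Hypothetical dynamic bit-width rule (NOT the paper's criterion):
    spend more bits while gradients are large, fewer as they shrink."""
    bits = min_bits + int(np.ceil(np.log2(1.0 + np.linalg.norm(grad))))
    return int(np.clip(bits, min_bits, max_bits))

# Toy usage: quantized SGD on a least-squares objective.
rng = np.random.default_rng(0)
A, y = rng.normal(size=(100, 10)), rng.normal(size=100)
x, lr = np.zeros(10), 0.01
for step in range(200):
    grad = A.T @ (A @ x - y) / len(y)        # full-batch gradient
    bits = choose_bits(grad)                  # dynamic quantization level
    x -= lr * quantize(grad, bits, rng)       # "communicated" gradient
print("final loss:", 0.5 * np.mean((A @ x - y) ** 2))
```

The point of the sketch is the loop structure: the bit-width, and hence the per-step communication cost, is chosen from the current gradient rather than fixed in advance, which is the trade-off the abstract refers to.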

Cited by 0 publications
References 1 publication
