2021
DOI: 10.48550/arxiv.2103.15195
Preprint

MergeComp: A Compression Scheduler for Scalable Communication-Efficient Distributed Training

Zhuang Wang,
Xinyu Wu,
T. S. Eugene Ng

Abstract: Large-scale distributed training is increasingly becoming communication bound. Many gradient compression algorithms have been proposed to reduce the communication overhead and improve scalability. However, it has been observed that in some cases gradient compression may even harm the performance of distributed training. In this paper, we propose MergeComp, a compression scheduler to optimize the scalability of communication-efficient distributed training. It automatically schedules the compression operations to…
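As an illustrative sketch of the kind of gradient compression the abstract refers to, the following shows top-k sparsification, a common compression technique: each worker keeps only the largest-magnitude fraction of its gradient entries before communication. This is a generic example under assumed names (`topk_compress`, `topk_decompress`), not MergeComp's actual implementation or scheduling logic.

```python
import numpy as np

def topk_compress(grad: np.ndarray, ratio: float = 0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the kept indices, their values, and the original shape,
    which is all a worker would need to transmit.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k largest-magnitude entries (unordered partition).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx], grad.shape

def topk_decompress(idx, values, shape):
    """Rebuild a dense gradient: zeros everywhere except the kept entries."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[idx] = values
    return flat.reshape(shape)

# Example: compress a 4x256 gradient down to 5% of its entries.
grad = np.random.randn(4, 256).astype(np.float32)
idx, vals, shape = topk_compress(grad, ratio=0.05)
restored = topk_decompress(idx, vals, shape)
```

Compression like this trades accuracy of each update for less traffic; the paper's point is that whether that trade pays off depends on how the compression operations are scheduled relative to communication.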

Cited by 0 publications. References 32 publications.