2022
DOI: 10.48550/arxiv.2202.02491
Preprint

Distributed Learning With Sparsified Gradient Differences

Abstract: A very large number of communications are typically required to solve distributed learning tasks, and this critically limits scalability and convergence speed in wireless communications applications. In this paper, we devise a Gradient Descent method with Sparsification and Error Correction (GD-SEC) to improve the communications efficiency in a general worker-server architecture. Motivated by a variety of wireless communications learning scenarios, GD-SEC reduces the number of bits per communication from worke…
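The abstract outlines the core idea of GD-SEC: each worker compresses what it sends to the server by sparsifying a gradient difference and carrying forward the discarded components through error correction. The paper's exact thresholding rule and state variables are not given in the truncated abstract, so the sketch below is only a minimal illustration of the general sparsification-with-error-feedback pattern; the top-k rule, the helper names, and the choice of k are assumptions for illustration, not the authors' specification.

```python
import numpy as np

def top_k_sparsify(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest (assumed rule)."""
    out = np.zeros_like(v)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def worker_message(grad, state, k=10):
    """One hypothetical worker-side step in a GD-SEC-style scheme.

    state = (residual, last_sent): the error-correction residual and the
    reference (previously transmitted) gradient. Only a sparse difference
    is returned for transmission to the server.
    """
    residual, last_sent = state
    # Gradient difference plus the error accumulated from earlier rounds.
    corrected = (grad - last_sent) + residual
    sparse_diff = top_k_sparsify(corrected, k)
    # Error correction: remember what was dropped this round.
    new_residual = corrected - sparse_diff
    new_last_sent = last_sent + sparse_diff
    return sparse_diff, (new_residual, new_last_sent)

# Tiny usage example on random "gradients".
rng = np.random.default_rng(0)
state = (np.zeros(100), np.zeros(100))
for _ in range(3):
    g = rng.normal(size=100)
    msg, state = worker_message(g, state, k=10)
    print("nonzeros sent:", np.count_nonzero(msg))
```

The sketch covers only worker-side compression; in the paper the server side would reconstruct each worker's contribution from these sparse differences, and the actual GD-SEC selection rule should be taken from the full text.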
