2021 IEEE Information Theory Workshop (ITW)
DOI: 10.1109/itw48936.2021.9611493

Approximate Gradient Coding for Heterogeneous Nodes

Abstract: In distributed machine learning (DML), the training data is distributed across multiple worker nodes so that the underlying training can be performed in parallel. One major problem affecting the performance of DML algorithms is the presence of stragglers: nodes that are exceedingly slow in completing their task, which results in under-utilization of the training data stored on them. To address this, gradient coding mitigates the impact of stragglers by adding sufficient redundancy to the data. Gradient coding and oth…

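To make the redundancy idea in the abstract concrete, the following is a minimal sketch in the spirit of fractional-repetition gradient coding (Tandon et al., 2017), not the scheme proposed in this paper: each data block is replicated on s+1 workers, so the master can still recover the full gradient when up to s workers straggle. All function names, parameters, and the toy data below are illustrative assumptions.

```python
# Illustrative sketch of replication-based gradient coding (NOT this paper's
# scheme): n workers split into groups of size s+1, each group holds one data
# block, so any s stragglers leave at least one responder per block.

import numpy as np


def assign_partitions(n_workers, s):
    """Map each worker to the data block it holds; every block is
    replicated on the s+1 workers of its group (assumes (s+1) | n_workers)."""
    assert n_workers % (s + 1) == 0, "sketch assumes (s+1) divides n_workers"
    return {w: w // (s + 1) for w in range(n_workers)}


def master_aggregate(worker_results, n_workers, s):
    """Recover the full gradient from the responses of non-stragglers.

    worker_results: dict {worker_id: partial gradient (np.ndarray)} holding
    only the workers that responded in time.
    """
    n_groups = n_workers // (s + 1)
    assignment = assign_partitions(n_workers, s)
    total = None
    for g in range(n_groups):
        # any single responder from group g supplies block g's gradient
        responder = next((w for w in worker_results if assignment[w] == g), None)
        if responder is None:
            raise RuntimeError(f"more than s stragglers hit group {g}")
        grad = worker_results[responder]
        total = grad if total is None else total + grad
    return total


if __name__ == "__main__":
    # toy usage: 6 workers, tolerate s = 2 stragglers, gradient dimension 4
    n, s, dim = 6, 2, 4
    rng = np.random.default_rng(0)
    block_grads = [rng.normal(size=dim) for _ in range(n // (s + 1))]
    assignment = assign_partitions(n, s)
    # simulate stragglers: workers 1 and 4 never respond
    responses = {w: block_grads[assignment[w]] for w in range(n) if w not in (1, 4)}
    recovered = master_aggregate(responses, n, s)
    assert np.allclose(recovered, sum(block_grads))
```

The design choice here is the simplest form of redundancy: pure replication, which tolerates exactly s stragglers at the cost of storing each block s+1 times. The paper's setting (approximate gradient coding for heterogeneous nodes) concerns more refined trade-offs than this sketch shows.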
Cited by 1 publication. References 16 publications.