2020
DOI: 10.48550/arxiv.2007.02191
Preprint

Coded Distributed Computing with Partial Recovery

Abstract: Coded computation techniques provide robustness against straggling workers in distributed computing. However, most of the existing schemes require exact provisioning of the straggling behaviour and ignore the computations carried out by straggling workers. Moreover, these schemes are typically designed to recover the desired computation results accurately, while in many machine learning and iterative optimization algorithms, faster approximate solutions are known to result in an improvement in the overall convergence…
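To make the idea of partial recovery concrete, here is a minimal simulation sketch, not the authors' exact scheme: a matrix-vector product is split into row blocks, each block is replicated on r workers, some workers straggle, and the master settles for whatever fraction of blocks it can recover instead of waiting for the full result. The parameter names (r, q, straggler_prob) and the cyclic repetition assignment are illustrative assumptions.

```python
# Sketch of coded computation with partial recovery (illustrative assumptions,
# not the scheme from the paper): replicate each row block on r workers,
# simulate stragglers, and accept any recovered fraction >= q of the blocks.
import numpy as np

rng = np.random.default_rng(0)

n_workers, n_blocks, r = 12, 12, 2        # r-fold repetition "code"
straggler_prob, q = 0.3, 0.75             # target: recover at least 75% of blocks

A = rng.standard_normal((n_blocks * 4, 64))
x = rng.standard_normal(64)
blocks = np.split(A, n_blocks)            # row blocks of A

# Each block is assigned to r distinct workers (simple cyclic repetition).
assignment = {b: [(b + j) % n_workers for j in range(r)] for b in range(n_blocks)}

# Simulate which workers return their results in time.
alive = rng.random(n_workers) > straggler_prob

# Partial recovery: a block is recovered if any of its r workers responded;
# unrecovered blocks are simply left at zero (approximate result).
partial = np.zeros(A.shape[0])
recovered = 0
for b, workers in assignment.items():
    if any(alive[w] for w in workers):
        partial[b * 4:(b + 1) * 4] = blocks[b] @ x
        recovered += 1

print(f"recovered {recovered}/{n_blocks} blocks "
      f"({'enough' if recovered >= q * n_blocks else 'not enough'} for target q={q})")
```

In an iterative algorithm, the master would use this partial product for the current iteration rather than waiting for the slowest workers, trading per-iteration accuracy for faster iterations.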

Cited by 1 publication (1 citation statement) | References: 26 publications
“…This is a different form of approximation as compared to that considered in previous works on gradient coding mentioned above, and such forms of approximate gradient recovery have found application in distributed learning algorithms [15], [16]. A similar objective function was also studied recently in [12], [24], [25], where the goal was to design strategies that benefit from both uncoded and coded computing schemes, and extensive numerical simulations were done to illustrate the advantages of allowing partial recovery. Finally, we would like to point out that while [21] studied approximate gradient coding in terms of ℓ2 error, their gradient code construction based on Batched Raptor codes can in fact be applied to the partial recovery framework being studied here as well.…”
Section: B. Approximate Gradient ... (arXiv:2102.10163v1 [cs.IT] 19 Feb 2021) — citation type: mentioning
confidence: 99%
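As a rough illustration of approximate gradient recovery in this partial-recovery spirit, the following is a minimal sketch assuming a fractional-repetition gradient code (a standard construction, not the Batched Raptor code of [21]): workers in the same group hold the same r data partitions and return the sum of the corresponding partial gradients; the master uses one response per group and drops groups whose workers all straggled, yielding an approximate gradient. All parameter names and the straggler model are assumptions for illustration.

```python
# Fractional-repetition gradient coding with partial recovery (illustrative).
import numpy as np

rng = np.random.default_rng(1)

n_workers, r = 12, 3                       # 4 groups of 3 workers each
n_groups = n_workers // r
d = 16                                     # model dimension
partial_grads = rng.standard_normal((n_workers, d))  # one gradient per data partition

# Group g is responsible for partitions g*r .. (g+1)*r - 1; every worker in
# the group computes and returns the sum of those partial gradients.
group_sums = np.array([partial_grads[g * r:(g + 1) * r].sum(axis=0)
                       for g in range(n_groups)])

# Simulate stragglers: each worker responds independently with probability 0.6.
responded = rng.random(n_workers) < 0.6

approx_grad = np.zeros(d)
recovered_groups = 0
for g in range(n_groups):
    if any(responded[w] for w in range(g * r, (g + 1) * r)):
        approx_grad += group_sums[g]       # any single responder per group suffices
        recovered_groups += 1

full_grad = partial_grads.sum(axis=0)
err = np.linalg.norm(approx_grad - full_grad) / np.linalg.norm(full_grad)
print(f"recovered {recovered_groups}/{n_groups} groups, relative l2 error {err:.3f}")
```

The relative ℓ2 error printed at the end corresponds to the kind of approximation metric discussed in the citing statement: the fewer groups recovered, the larger the deviation from the exact gradient.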