2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton) 2017
DOI: 10.1109/allerton.2017.8262879
A sequential approximation framework for coded distributed optimization

Abstract: Building on the previous work of Lee et al. [2] and Ferdinand et al. [3] on coded computation, we propose a sequential approximation framework for solving optimization problems in a distributed manner. In a distributed computation system, latency caused by slow individual processors ("stragglers") usually causes a significant delay in the overall process. The proposed method is powered by a sequential computation scheme, which is designed specifically for systems with stragglers. This scheme has the desirable prope…

Cited by 23 publications (26 citation statements)
References 12 publications
“…Therefore, to provide another estimate with lower error, the input matrices must be compressed anew, and a new coding step needs to be applied to the new random samples of the inputs. Another line of work, [16] and [17], brings the idea of approximate computing to the encoding and decoding steps. Using maximum distance separable (MDS) codes, [16] and [17] guarantee accuracy improvement over time by intelligently prioritizing and allocating tasks among workers.…”
Section: Background, 1) Polynomial-based CDC
confidence: 99%
“…Another line of work, [16] and [17], brings the idea of approximate computing to the encoding and decoding steps. Using maximum distance separable (MDS) codes, [16] and [17] guarantee accuracy improvement over time by intelligently prioritizing and allocating tasks among workers. These methods, while effective, apply only to matrix-vector multiplication.…”
Section: Background, 1) Polynomial-based CDC
confidence: 99%
“…The shifted exponential model we considered above does not reflect this behavior. Thus, $k_j = n$ is a possible solution to (9). To robustify the solution against the possible presence of persistent stragglers, we change the optimization problem in (9) to max z s.t.…”
Section: Asymptotic Analysis
confidence: 99%
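The shifted exponential runtime model referenced above is commonly used in this literature: each worker finishes at time shift + Exp(rate), and a coded scheme that only needs the fastest k of n workers completes at the k-th order statistic of the runtimes. The sketch below is our illustration of that latency model, not code from the cited paper; all parameter values are arbitrary.

```python
# Illustrative shifted-exponential latency model: each worker's finish
# time is shift + an exponential random variable. A coded scheme that
# needs only the fastest k of n workers completes at the k-th smallest
# finish time, so choosing k < n hedges against stragglers.

import random

def worker_times(n, shift=1.0, rate=1.0, seed=0):
    rng = random.Random(seed)
    return sorted(shift + rng.expovariate(rate) for _ in range(n))

def completion_time(times, k):
    # Time until the k fastest of the n workers have finished.
    return times[k - 1]

times = worker_times(n=10)
# Waiting for all n workers is dominated by the slowest one;
# waiting for only k < n trades coding redundancy for lower latency.
print(completion_time(times, k=8) <= completion_time(times, k=10))  # True
```

Under this model a persistent straggler (a worker that never finishes) has infinite runtime, which the shifted exponential distribution never produces; that mismatch is exactly why the quoted passage robustifies the optimization.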
“…The second concept, coded computation [22], the focus of this paper, provides resiliency to stragglers and can be utilized to mitigate tail latency in distributed computing [11,22,27,34,38,42]. In particular, several of these works target distributed machine learning. There have been a few recent works in the coded computing literature that exploit the computations of slow nodes [41,47]; however, the key ingredient of our proposed strategy is that it dynamically adapts the computation load of each node to its speed as estimated from previous rounds of computation.…”
Section: Related Work
confidence: 99%
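The load-adaptation idea in the excerpt above can be sketched in a few lines. This is a hypothetical minimal illustration of the general principle (not the cited paper's actual strategy): each node receives a share of the next round's work proportional to its speed as estimated from previous rounds.

```python
# Hypothetical load adaptation: split total_rows rows of work across
# nodes in proportion to their estimated speeds (rows/sec), so faster
# nodes get proportionally larger assignments in the next round.

def adapt_loads(speeds, total_rows):
    total_speed = sum(speeds)
    loads = [round(total_rows * s / total_speed) for s in speeds]
    # Fix rounding so the assignments sum exactly to total_rows.
    loads[-1] += total_rows - sum(loads)
    return loads

# Node 1 measured twice as fast as nodes 2 and 3 in earlier rounds:
print(adapt_loads([2.0, 1.0, 1.0], 100))  # [50, 25, 25]
```

In practice the speed estimates would be updated each round (e.g. by an exponential moving average of observed throughput), which is what makes the allocation dynamic.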