2019
DOI: 10.48550/arxiv.1910.06093
Preprint

Election Coding for Distributed Learning: Protecting SignSGD against Byzantine Attacks

Jy-yong Sohn,
Dong-Jun Han,
Beongjun Choi
et al.

Abstract: Recent advances in large-scale distributed learning algorithms have enabled communication-efficient training via SignSGD. Unfortunately, a major issue continues to plague distributed learning: namely, Byzantine failures may incur serious degradation in learning accuracy. This paper proposes Election Coding, a coding-theoretic framework to guarantee Byzantine robustness for SignSGD with majority vote, which uses minimum worker-master communication in both directions. The suggested framework explores new informa…
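For context, the baseline scheme the paper hardens is SignSGD with majority vote: each worker transmits only the element-wise sign of its local gradient (1 bit per coordinate), and the master aggregates by majority vote before broadcasting the result back. A minimal sketch of that baseline step (not the paper's Election Coding construction; function and variable names here are illustrative):

```python
import numpy as np

def signsgd_majority_vote_step(params, worker_grads, lr=0.01):
    """One step of SignSGD with majority vote (illustrative sketch).

    Workers compress their local gradients to element-wise signs; the
    master takes the element-wise majority vote of those signs and
    broadcasts a single sign vector, which every worker applies.
    """
    # Workers: compress each local gradient to its element-wise sign.
    signs = [np.sign(g) for g in worker_grads]
    # Master: majority vote = sign of the coordinate-wise sum of signs.
    vote = np.sign(np.sum(signs, axis=0))
    # Workers: apply the 1-bit update broadcast by the master.
    return params - lr * vote

params = np.zeros(3)
grads = [np.array([0.5, -1.0, 2.0]),
         np.array([0.2, -0.3, -0.1]),
         np.array([-0.4, -0.8, 0.9])]
new_params = signsgd_majority_vote_step(params, grads, lr=0.1)
# votes per coordinate: [+1, -1, +1] -> new_params = [-0.1, 0.1, -0.1]
```

A Byzantine worker can flip its transmitted signs arbitrarily; the paper's Election Coding adds coding-theoretic redundancy across workers so the majority vote remains correct despite such flips.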


Cited by 1 publication (1 citation statement)
References 11 publications
“…Convergence analysis on 1-bit SGD is given in (Bernstein et al, 2018a;Karimireddy et al, 2019;Safaryan & Richtárik, 2021). Bernstein et al (2018b); Sohn et al (2019); Le Phong & Phuong (2020); Lyu (2021) investigate the robustness of 1-bit SGD. Perhaps the closest works to this paper are (Tang et al, 2021;Li et al, 2021), which propose using two-stage training to enable 1-bit Adam and 1-bit Lamb, respectively.…”
Section: Related Work
confidence: 99%