2021
DOI: 10.48550/arxiv.2104.06685
Preprint
BROADCAST: Reducing Both Stochastic and Compression Noise to Robustify Communication-Efficient Federated Learning

Abstract: Communication between workers and the master node to collect local stochastic gradients is a key bottleneck in a large-scale federated learning system. Various recent works have proposed to compress the local stochastic gradients to mitigate the communication overhead. However, robustness to malicious attacks is rarely considered in such a setting. In this work, we investigate the problem of Byzantine-robust federated learning with compression, where the attacks from Byzantine workers can be arbitrarily malici…
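The setting the abstract describes, workers compressing their local stochastic gradients before transmission, can be illustrated with a minimal top-k sparsifier. Top-k is only one common compressor chosen here for illustration, not necessarily the scheme BROADCAST itself uses:

```python
import numpy as np

def top_k_compress(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Returns the sparse compressed gradient and the compression error;
    error-feedback schemes carry this error into the next round.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    compressed = np.zeros_like(grad)
    compressed[idx] = grad[idx]
    error = grad - compressed
    return compressed, error

g = np.array([0.1, -2.0, 0.05, 3.0, -0.2])
c, e = top_k_compress(g, 2)
# c keeps only the entries 3.0 and -2.0; e holds everything dropped
```

Only k values (plus indices) per round need to be sent instead of the full vector, which is the communication saving the abstract refers to; the compression error `e` is the "compression noise" the paper's title says it reduces.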

Cited by 1 publication (7 citation statements)
References 18 publications
“…(2) Although the analyses by Wu et al [2020], Zhu and Ling [2021] support inexact geometric median computation, for simplicity of presentation, we assume that geometric median is computed exactly.…”
Section: Br-mvr
confidence: 99%
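The quoted statement concerns geometric median aggregation, the robust aggregator whose inexact computation is analyzed by Wu et al [2020] and Zhu and Ling [2021]. A minimal sketch of its standard approximation, Weiszfeld's algorithm, applied to worker gradients (NumPy assumed; not the papers' exact implementation):

```python
import numpy as np

def geometric_median(points, iters=100, eps=1e-8):
    """Approximate the geometric median via Weiszfeld's iterations.

    `points` is an (n, d) array of worker gradients; the geometric
    median minimizes the sum of Euclidean distances to all rows,
    making it robust to a minority of Byzantine (arbitrary) values.
    """
    z = points.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(points - z, axis=1)
        d = np.maximum(d, eps)  # avoid division by zero at a data point
        w = 1.0 / d
        z_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(z_new - z) < eps:
            break
        z = z_new
    return z

# three honest 1-D gradients near 1.0 and one Byzantine outlier
grads = np.array([[1.0], [1.1], [0.9], [100.0]])
gm = geometric_median(grads)
# the geometric median stays near 1, unlike the mean (25.75)
```

Because the fixed-point iteration only converges approximately, analyses such as those cited above must account for inexact computation; the quoted work assumes exactness purely to simplify presentation.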
“…In particular, compressed methods are analyzed in conjunction with variance reduction [Horváth et al, 2019b, Gorbunov et al, 2020b, Danilova and Gorbunov, 2022], acceleration [Li and Richtárik, 2021, Qian et al, 2021b], decentralized communications [Koloskova et al, 2019, Kovalev et al, 2021], local steps [Basu et al, 2019, Haddadpour et al, 2021], adaptive compression [Faghri et al, 2020], second-order methods [Safaryan et al, 2021], and min-max optimization [Beznosikov et al, 2021, 2022]. However, to our knowledge, only one work studies communication compression in the context of Byzantine-robustness [Zhu and Ling, 2021] (see the discussion above Q2). Our work makes a further step towards closing this significant gap in the literature.…”
Section: A Detailed Related Work
confidence: 99%