2021 60th IEEE Conference on Decision and Control (CDC) 2021
DOI: 10.1109/cdc45484.2021.9683443
Federated Learning with Incrementally Aggregated Gradients

Cited by 13 publications (29 citation statements) · References 9 publications
“…SGD with biased noise. Many algorithms can be viewed as SGD with structured but potentially biased noise, including SGD with (biased) compression (Stich et al., 2018; Gorbunov et al., 2020), delayed SGD (Mania et al., 2017; Dutta et al., 2018), local SGD (Stich, 2019), federated learning methods (Karimireddy et al., 2020; Yuan & Ma, 2020; Mitra et al., 2021; Nguyen et al., 2022), decentralized optimization methods (Yu et al., 2019; Koloskova et al., 2020), and many others. Convergence analyses for such methods often use techniques like perturbed iterate analysis (Mania et al., 2017).…”
Section: Related Work
confidence: 99%
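The excerpt above groups several methods under "SGD with structured but potentially biased noise." A minimal sketch of one such instance, assuming nothing beyond the cited idea itself: gradient descent where each gradient is passed through a top-k sparsifier, a standard example of a biased compressor (Stich et al., 2018). The quadratic objective, step size, and function names here are illustrative assumptions, not from any of the cited papers.

```python
def top_k(v, k):
    """Keep the k largest-magnitude coordinates of v; zero the rest.
    This operator is biased: its expectation is not the input gradient."""
    keep = sorted(range(len(v)), key=lambda i: abs(v[i]), reverse=True)[:k]
    return [v[i] if i in keep else 0.0 for i in range(len(v))]

def compressed_gd(target, steps=200, lr=0.1, k=1):
    """Gradient descent on f(w) = ||w - target||^2 with top-k
    compressed gradients (a hypothetical toy problem)."""
    w = [0.0] * len(target)
    for _ in range(steps):
        grad = [2 * (wi - ti) for wi, ti in zip(w, target)]
        cg = top_k(grad, k)                 # biased compression step
        w = [wi - lr * gi for wi, gi in zip(w, cg)]
    return w

w = compressed_gd([1.0, 2.0])
# w approaches the minimizer [1.0, 2.0] despite the biased updates
```

Even though each compressed step is a biased descent direction, the iterates still converge on this toy problem; analyses of such schemes often bound the accumulated bias via perturbed iterate arguments, as the excerpt notes.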
“…As one of the earliest methods, FedAvg has been shown to effectively reduce the communication cost (McMahan et al., 2017). An increasing number of variants of FedAvg have been proposed to address issues such as slow convergence and client drift via regularization (Li et al., 2020; Acar et al., 2021), variance reduction (Mitra et al., 2021; Karimireddy et al., 2020), proximal splitting (Pathak & Wainwright, 2020), and adaptive optimization (Reddi et al., 2020). In the homogeneous setting, FedAvg is related to local SGD, and has been analyzed in Stich (2019); Wang & Joshi (2018); Stich & Karimireddy (2019); Basu et al. (2019).…”
Section: Related Work
confidence: 99%
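The FedAvg algorithm referenced in the excerpt above can be sketched in a few lines: each client runs several local SGD steps from the current global model, and the server averages the results. This is a minimal toy sketch, not the implementation from any cited paper; the scalar quadratic client losses (each client i minimizes (w - t_i)^2) and all hyperparameters are illustrative assumptions.

```python
def fedavg(client_targets, rounds=50, local_steps=5, lr=0.1):
    """Toy FedAvg: client i minimizes (w - t_i)^2 on a scalar parameter,
    so heterogeneity comes from differing targets t_i. Each round, every
    client takes local gradient steps, then the server averages."""
    w = 0.0
    for _ in range(rounds):
        local_models = []
        for t in client_targets:
            w_i = w
            for _ in range(local_steps):
                w_i -= lr * 2 * (w_i - t)      # local SGD step
            local_models.append(w_i)
        w = sum(local_models) / len(local_models)  # server aggregation
    return w

print(fedavg([1.0, -1.0]))  # symmetric targets: stays at their average, 0.0
```

With heterogeneous targets the averaged model settles at the mean of the client optima; the "client drift" that variants like those in Mitra et al. (2021) and Karimireddy et al. (2020) correct for shows up when local steps pull clients toward their own minima between aggregations.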
“…In the homogeneous setting, FedAvg is related to local SGD, and has been analyzed in Stich (2019); Wang & Joshi (2018); Stich & Karimireddy (2019); Basu et al. (2019). In the heterogeneous setting, Li et al. (2020); Mitra et al. (2021); Li et al. (2019); Khaled et al. (2019) provided convergence analyses of their methods.…”
Section: Related Work
confidence: 99%
“…Federated learning: At the core of federated learning is the prevailing FedAvg algorithm and its variants (McMahan et al., 2017; Karimireddy et al., 2019; Mitra et al., 2021; Acar et al., 2021; Stich, 2018; Yu et al., 2019; Qu et al., 2020), which address communication efficiency and data privacy concerns. We review the literature with a focus on analyses of the linear speedup for convergence.…”
Section: Related Work
confidence: 99%