2021
DOI: 10.48550/arxiv.2106.06047
Preprint

Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning

Abstract: Federated learning is an emerging research paradigm enabling collaborative training of machine learning models among different organizations while keeping data private at each institution. Despite recent progress, there remain fundamental challenges such as lack of convergence and potential for catastrophic forgetting in federated learning across real-world heterogeneous devices. In this paper, we demonstrate that attention-based architectures (e.g., Transformers) are fairly robust to distribution shifts and h…
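To make the federated setting described in the abstract concrete, here is a minimal sketch of the standard FedAvg aggregation step, in which a server averages client model parameters weighted by local dataset size. This is a generic illustration of the paradigm, not the architecture-level method this paper proposes; the function and variable names are hypothetical.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Server-side FedAvg step: average client parameter vectors,
    weighting each client by the size of its local dataset."""
    total = sum(client_sizes)
    coeffs = np.array(client_sizes, dtype=float) / total  # per-client weight
    stacked = np.stack(client_weights)                    # (n_clients, n_params)
    return np.tensordot(coeffs, stacked, axes=1)          # weighted average

# Three clients with heterogeneous amounts of local data.
weights = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 10, 20]
global_w = fedavg(weights, sizes)
print(global_w)  # [3.5, 4.5]
```

The data-size weighting means clients with more samples pull the global model further toward their local optimum, which is precisely where the data-heterogeneity issues the paper studies arise.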

Cited by 1 publication (1 citation statement)
References 49 publications (77 reference statements)
“…To our knowledge, very few works have studied this [55,73]; none are as systematic as ours on the effect of pre-training. [67,38,15] used pre-trained models in their experiments but did not or only briefly analyze their effect.…”
Section: Related Work
confidence: 99%