2021
DOI: 10.48550/arxiv.2102.03448
Preprint

Federated Reconstruction: Partially Local Federated Learning

Cited by 4 publications (4 citation statements); references 0 publications.

“…Splitting the entire model into a shared part and a local part is another natural approach to personalization. Unlike the stateful algorithms mentioned in Section 7.5.1, Singhal et al. [230] proposed an algorithm where the local part is reconstructed locally each time a client participates in a round, and showed that the proposed algorithm also fits into the meta-learning framework.…”
Section: Algorithms That Do Not Require Client-side State or Identifier
Citation type: mentioning (confidence: 99%)
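The split-plus-reconstruction idea quoted above can be illustrated with a toy matrix factorization model, where item embeddings form the globally shared part and each client's user embedding is the local part rebuilt from scratch whenever the client participates. The code below is a minimal sketch under those assumptions; the function name client_round, the hyperparameters, and the toy data are illustrative and not taken from the cited paper.

# Minimal sketch of a stateless, partially local client round:
# shared item embeddings are updated and returned, while the
# per-client user embedding is reconstructed on device each round
# and never leaves the client.
import numpy as np

def client_round(item_emb, ratings, item_ids, dim=8,
                 recon_steps=20, update_steps=20, lr=0.05):
    rng = np.random.default_rng(0)

    # Reconstruction phase: freeze shared item embeddings and fit the
    # local user embedding from scratch on this client's ratings.
    user_emb = rng.normal(scale=0.1, size=dim)
    for _ in range(recon_steps):
        pred = item_emb[item_ids] @ user_emb
        err = pred - ratings                                  # squared-error residuals
        user_emb -= lr * (item_emb[item_ids].T @ err) / len(ratings)

    # Update phase: freeze the reconstructed user embedding and take
    # gradient steps on the shared item embeddings only.
    new_item_emb = item_emb.copy()
    for _ in range(update_steps):
        pred = new_item_emb[item_ids] @ user_emb
        err = pred - ratings
        new_item_emb[item_ids] -= lr * np.outer(err, user_emb) / len(ratings)

    # Only the delta on the shared parameters is reported to the server.
    return new_item_emb - item_emb

# Toy usage: 100 items, one client that rated 5 of them.
item_emb = np.random.default_rng(1).normal(scale=0.1, size=(100, 8))
delta = client_round(item_emb, ratings=np.array([1., 0., 1., 1., 0.]),
                     item_ids=np.array([3, 17, 42, 56, 80]))
item_emb += delta  # single-client "aggregation", for illustration only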
“…Closely related to federated dropout, ordered dropout [Horvath et al., 2021] extracts submodels from a main model and adapts the computation and communication costs to the capabilities of edge devices. Singhal et al. [2021] propose a method that partially reconstructs the model on the edge devices, which reduces communication cost. The reconstructed parameters are local to the clients and never sent to the server.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
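For contrast with partial reconstruction, the nested-submodel extraction attributed to ordered dropout in the quote above can be sketched as keeping a prefix of the units in each hidden layer, sized by a device capability fraction p. This is a hedged illustration of the general idea only; the layer shapes, the prefix rule for the output layer, and the function name extract_submodel are assumptions, not the authors' implementation.

# Sketch of nested submodel extraction: a capability fraction p keeps
# the first ceil(p * width) units of each hidden layer, so weaker devices
# train and communicate a smaller, prefix-aligned slice of the full model.
import math
import numpy as np

def extract_submodel(weights, p):
    """weights: list of (W, b) for dense layers, W of shape (in, out).
    Returns the prefix submodel for capability fraction p in (0, 1]."""
    sub = []
    in_keep = weights[0][0].shape[0]          # keep all input features
    for i, (W, b) in enumerate(weights):
        out_full = W.shape[1]
        # Keep every unit of the final (output) layer, a prefix elsewhere.
        out_keep = out_full if i == len(weights) - 1 else max(1, math.ceil(p * out_full))
        sub.append((W[:in_keep, :out_keep], b[:out_keep]))
        in_keep = out_keep
    return sub

# Toy usage: a 2-layer MLP; a weak device gets ~25% of the hidden units.
rng = np.random.default_rng(0)
full = [(rng.normal(size=(16, 64)), np.zeros(64)),
        (rng.normal(size=(64, 10)), np.zeros(10))]
small = extract_submodel(full, p=0.25)
print([w.shape for w, _ in small])  # [(16, 16), (16, 10)]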
“…In recent years, deep learning language models trained on large amounts of centralized data have achieved impressive performance on various downstream NLP tasks (e.g., medical case analysis [1], sentiment analysis [2], next-word prediction in mobile keyboards [3]). Such success is mainly due to advanced machine learning techniques and large-scale data collection.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)