2021 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/globecom46510.2021.9685230
An Efficient Multi-Model Training Algorithm for Federated Learning

Cited by 7 publications (5 citation statements)
References 6 publications
“…Thus, the higher throughput brought by a larger batch size does not necessarily shorten training time. The observed relationship between convergence efficiency and batch size also matches a recent study [58].…”
Section: Ablation Study (supporting)
confidence: 88%
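The statement above reasons that a larger batch size raises per-step throughput but can require more passes over the data to reach the same accuracy, so wall-clock training time does not automatically shrink. The short sketch below illustrates that tradeoff; the throughput and epochs-to-converge figures, and the helper name wall_clock_time, are purely hypothetical assumptions for illustration, not measurements from the cited papers.

def wall_clock_time(batch_size, samples_per_sec, epochs_to_converge, dataset_size=50_000):
    """Total training time = time per step x steps per epoch x epochs needed."""
    steps_per_epoch = dataset_size / batch_size
    time_per_step = batch_size / samples_per_sec
    return epochs_to_converge * steps_per_epoch * time_per_step

# Hypothetical settings: the larger batch gives ~1.5x throughput but
# needs twice as many epochs to reach the same target accuracy.
small = wall_clock_time(batch_size=64,  samples_per_sec=2_000, epochs_to_converge=30)
large = wall_clock_time(batch_size=512, samples_per_sec=3_000, epochs_to_converge=60)
print(f"batch 64 : {small:.0f} s")   # 750 s
print(f"batch 512: {large:.0f} s")   # 1000 s, despite the higher throughput

Under these assumed numbers the larger batch finishes each epoch faster yet takes longer overall, which is the point the citing paper makes.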
“…Training multiple models in a federated setting has also been explored in [16]. However, the setting and the model-training methodology are different from ours [3].…”
Section: Related Work (mentioning)
confidence: 99%
“…However, the setting and the model-training methodology are different from ours [3]. In [16], client distribution among models is approached as an optimization problem and a heuristic algorithm is proposed. However, neither an optimal algorithm nor any convergence guarantees are provided.…”
Section: Related Work (mentioning)
confidence: 99%
“…The idea is that these two approaches together will generate a generic model that can be used globally. Another new area of research is multi-model federated learning [29][30][31].…”
Section: Motivation / 1.2.1 Problem Statement (mentioning)
confidence: 99%
“…and Moharir [30] proposed multi-model federated learning by simultaneously training multiple uncorrelated models in a federated setting. The researchers in [29,31] also worked on multi-model federated learning. As pointed out in Chapter 1, research on multi-modality federated learning is extremely limited [25][26][27][28].…”
Section: Multi-Modality Federated Learning (mentioning)
confidence: 99%