2021
DOI: 10.48550/arxiv.2108.09749
Preprint
Flexible Clustered Federated Learning for Client-Level Data Distribution Shift

Abstract: Federated Learning (FL) enables multiple participating devices to collaboratively contribute to a global neural network model while keeping the training data local. Unlike in the centralized training setting, the non-IID, imbalanced (statistically heterogeneous), and distribution-shifted training data of FL is spread across the federated network, which increases the divergence between the local models and the global model, further degrading performance. In this paper, we propose a flexible clustered fede…
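The abstract's core setting — clients training locally and a server aggregating their models — can be illustrated with a minimal FedAvg-style sketch. This is a hedged toy example, not the paper's method: `local_update` and `fed_avg` are hypothetical names, and the gradient step toward a local data mean merely stands in for local SGD on private data.

```python
# Toy sketch of one FL communication round (hypothetical names; the local
# "training" is a single step toward each client's data mean).

def local_update(weights, data, lr=0.1):
    # Each client nudges the shared weights toward its local data,
    # standing in for local SGD steps on private data.
    return [w - lr * (w - d) for w, d in zip(weights, data)]

def fed_avg(client_weights, client_sizes):
    # Server averages client models weighted by local dataset size.
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# One round over three clients with non-IID "data" of different sizes.
global_w = [0.0, 0.0]
clients = [([1.0, 0.0], 10), ([0.0, 1.0], 20), ([1.0, 1.0], 30)]
updated = [local_update(global_w, d) for d, _ in clients]
global_w = fed_avg(updated, [n for _, n in clients])
```

With heterogeneous local data, the per-client updates pull in different directions; the weighted average is what the abstract's "divergence between the local models and the global model" refers to.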

Cited by 1 publication (2 citation statements)
References 26 publications
“…Previous works that have considered collaborative training and server failure include [35], [36] in the context of FL and [12], [32], [37], which consider gossip-based training schemes. While gossip-based methods rely on random walks of data, the former FL modification methods opt for a clustering method from which Tol-FL draws its inspiration.…”
Section: B. Prior Federated Learning Research
confidence: 99%
“…While gossip-based methods rely on random walks of data, the former FL modification methods opt for a clustering method from which Tol-FL draws its inspiration. Ignoring the constraints of local communication, the FL-based methods [35], [36], [11] determine a natural grouping scheme over all of the available devices based on the similarity between their datasets. This modified scheme discards the speed benefit that arises through single-hop communications, as well as practical considerations of link availability, and instead forms virtual clusters that may include devices with large communication delays.…”
Section: B. Prior Federated Learning Research
confidence: 99%
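The citation above describes grouping clients by dataset similarity, ignoring communication topology. A common proxy for dataset similarity (since raw data stays private) is the similarity of client model updates. The sketch below is a hypothetical illustration of that idea — a greedy cosine-similarity grouping, not the clustering algorithm of the cited papers:

```python
# Hedged sketch: group clients whose model updates point in similar
# directions (a proxy for similar local data distributions). All names
# and the greedy single-pass scheme are illustrative assumptions.

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def cluster_clients(updates, threshold=0.9):
    # A client joins the first cluster whose representative update is
    # cosine-similar above `threshold`; otherwise it starts a new cluster.
    clusters = []  # list of (representative_update, member_indices)
    for i, u in enumerate(updates):
        for rep, members in clusters:
            if cosine(u, rep) >= threshold:
                members.append(i)
                break
        else:
            clusters.append((u, [i]))
    return [members for _, members in clusters]

# Clients 0 and 1 have near-identical updates; client 2 differs.
updates = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
groups = cluster_clients(updates)
```

Note the trade-off the citation highlights: clusters formed this way reflect data similarity only, so members of one virtual cluster may be far apart on the network and incur large communication delays.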