2021
DOI: 10.48550/arxiv.2104.14628
Preprint
Cluster-driven Graph Federated Learning over Multiple Domains

Abstract: Federated Learning (FL) deals with learning a central model (i.e. the server) in privacy-constrained scenarios, where data are stored on multiple devices (i.e. the clients). The central model has no direct access to the data, but only to the updates of the parameters computed locally by each client. This raises a problem, known as statistical heterogeneity, because the clients may have different data distributions (i.e. domains). This is only partly alleviated by clustering the clients. Clustering may reduce h…
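The setup the abstract describes can be sketched minimally: the server holds one model per cluster, each client runs a local update on its private data, and the server averages updates only within a cluster. This is a toy illustration, not the paper's method: it uses least-squares clients, a FedAvg-style mean, and fixed (known) cluster assignments, whereas the paper's contribution is discovering the clusters; all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_step(w, X, y, lr=0.1):
    # One gradient step of least-squares on the client's private data;
    # only the updated weights (not X, y) are sent to the server.
    return w - lr * (X.T @ (X @ w - y)) / len(y)

def fed_round(cluster_models, clients):
    # clients: list of (cluster_id, X, y); the server averages client
    # updates within each cluster (FedAvg per cluster).
    new_models = {}
    for cid, model in cluster_models.items():
        updates = [local_step(model, X, y) for k, X, y in clients if k == cid]
        new_models[cid] = np.mean(updates, axis=0) if updates else model
    return new_models

# Two synthetic domains: clients in cluster 0 see y = 2x, cluster 1 sees y = -2x.
def make_client(cluster):
    X = rng.normal(size=(32, 1))
    slope = 2.0 if cluster == 0 else -2.0
    return (cluster, X, slope * X[:, 0])

clients = [make_client(c) for c in (0, 0, 1, 1)]
models = {0: np.zeros(1), 1: np.zeros(1)}
for _ in range(200):
    models = fed_round(models, clients)
# Each cluster model converges to its own domain's slope (≈ 2 and ≈ -2),
# which a single global model could not represent simultaneously.
```

A single averaged model would be pulled toward the mean of the two slopes (≈ 0), which is why per-cluster models help under statistical heterogeneity.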

Cited by 1 publication (1 citation statement)
References 23 publications
“…For example, hierarchical federated learning adds an additional layer of hierarchical structure to federated learning, which could be used to reduce latency or to cluster together similar players (Lin et al. [2018], Liu et al. [2020]). Many other works also relate to clustering, such as (Lee et al. [2020], Sattler et al. [2020], Shlezinger et al. [2020], Wang et al. [2020], Duan et al., Jamali-Rad et al. [2021], Caldarola et al. [2021]). These works, which tend to be more applied than our work, may also differ in that they analyze situations where additional information is known, such as the data distribution at each location.…”
Section: Related Work
confidence: 99%