2021
DOI: 10.1016/j.jpdc.2020.10.006
Decentralized learning works: An empirical comparison of gossip learning and federated learning

Cited by 88 publications (41 citation statements). References 9 publications.
“…In addition, peer-to-peer protocols are also employed in FL to remove the need for a central node [105, 106]. Gossip learning [107, 108] is an alternative learning framework to FL [109, 110, 111, 112]. Under the gossip learning framework, no central node is required; nodes on the network exchange and aggregate models directly.…”
Section: Discussion
confidence: 99%
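The excerpt above describes the core gossip-learning loop: with no central node, each participant periodically sends its model to a random peer, which merges it into its own. A minimal sketch of that idea, with illustrative names and toy one-parameter models (not code from the cited works):

```python
import random

def average(model_a, model_b):
    """Element-wise average of two parameter vectors."""
    return [(a + b) / 2 for a, b in zip(model_a, model_b)]

def gossip_round(models, rng):
    """One gossip round: every node pushes its model to one random peer,
    which averages the received parameters into its own. No server involved."""
    n = len(models)
    for sender in range(n):
        receiver = rng.choice([i for i in range(n) if i != sender])
        models[receiver] = average(models[receiver], models[sender])
    return models

rng = random.Random(0)
# Three nodes, each holding a toy 1-parameter "model".
models = [[0.0], [4.0], [8.0]]
for _ in range(20):
    gossip_round(models, rng)
# Repeated pairwise averaging drives the nodes toward a common model.
```

In a real deployment the averaging step would combine full parameter tensors (possibly weighted by local sample counts), and peer selection would follow an overlay network rather than uniform sampling.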
“…Unfortunately, while gossip learning has been shown to be applicable to many different ML workloads [22][23][24], no study that we are aware of tested it in actual physical environments or evaluated its practical use for large-scale deep learning training. However, recent studies suggest that gossip learning compares favorably to federated learning [25] and that it can be extended to work in constrained and highly heterogeneous environments [26].…”
Section: Decentralized Machine Learning
confidence: 99%
“…With the emergence of infrastructure-less network scenarios, the optimization carried out in FL is moving out of the central server to be handled in a decentralized manner [15]. One of the first methods used to carry out server-less FL is based on the Gossip algorithm [16], which has been applied for deep learning (DL) distributed training with remarkable success [17]. Other solutions such as the one presented in [18] advocate for hybrid settings, where clients can use device-to-device (D2D) links to aggregate information from nodes away from the base station (BS).…”
Section: Related Work
confidence: 99%
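For contrast with the decentralized schemes above, the server-based FL baseline these excerpts refer to aggregates client models at a central node, typically weighted by local dataset size (FedAvg-style). A minimal sketch under that assumption, with illustrative names:

```python
def federated_average(client_models, client_sizes):
    """Return the dataset-size-weighted average of client parameter vectors,
    as a central server would compute it in FedAvg-style aggregation."""
    total = sum(client_sizes)
    dim = len(client_models[0])
    return [
        sum(model[j] * size for model, size in zip(client_models, client_sizes)) / total
        for j in range(dim)
    ]

# Two clients: the second holds three times as much data, so its
# parameters dominate the aggregate.
print(federated_average([[1.0], [3.0]], [1, 3]))  # -> [2.5]
```

The hybrid D2D setting mentioned in the excerpt would insert an extra hop: clients far from the base station first average locally over device-to-device links, then forward a single combined model to the server.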