2022
DOI: 10.1109/twc.2022.3153495
Asynchronous Federated Learning Over Wireless Communication Networks

Abstract: In this paper, federated learning (FL) over wireless networks is investigated. In each communication round, a subset of devices is selected to participate in the aggregation with limited time and energy. In order to minimize the convergence time, global loss and latency are jointly considered in a Stackelberg game based framework. Specifically, age of information (AoI) based device selection is considered at leader-level as a global loss minimization problem, while sub-channel assignment, computational resourc…
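The leader-level, AoI-based device selection described in the abstract can be sketched in a few lines. This is a minimal illustrative assumption, not the paper's exact formulation: the rule "pick the K devices whose updates are oldest" and all function names here are hypothetical.

```python
def select_devices(aoi, k):
    """Return indices of the k devices with the largest age of information."""
    return sorted(range(len(aoi)), key=lambda i: aoi[i], reverse=True)[:k]

def update_aoi(aoi, selected):
    """Selected devices upload fresh models (their age resets to 0);
    every other device's information grows one round older."""
    chosen = set(selected)
    return [0 if i in chosen else a + 1 for i, a in enumerate(aoi)]

aoi = [3, 0, 5, 1]                 # current ages of four devices' updates
chosen = select_devices(aoi, 2)    # devices 2 and 0 have the oldest updates
aoi = update_aoi(aoi, chosen)      # ages become [0, 1, 0, 2]
```

In the paper this selection is coupled with the follower-level resource allocation through a Stackelberg game; the sketch above shows only the freshness bookkeeping.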

Cited by 66 publications (33 citation statements)
References 45 publications
“…Applying (23) to the second term of (44), and (22) to the last term of (44), then according to (19), we have…”
Section: Discussion (mentioning, confidence: 99%)
“…Despite its high communication efficiency, Sync FL can suffer from low training efficiency, since its convergence speed is limited by the slowest users, known as stragglers [22]. To handle the straggler issue, Async FL is a more flexible solution: it allows asynchronous aggregation without waiting for the stragglers [23,24]. However, Async FL has its own unique challenges, including: 1) the staleness problem, where a local model based on an old global model may be harmful to the current aggregation.…”
Section: Introduction (mentioning, confidence: 99%)
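The staleness problem mentioned above is commonly handled by down-weighting updates computed from old global models. The decay rule alpha/(1 + staleness) below is one illustrative choice, not the scheme of any particular cited paper; all names are hypothetical.

```python
def async_update(global_model, local_model, staleness, alpha=0.5):
    """Mix a (possibly stale) local model into the global model.

    staleness: number of global versions elapsed since the device
    pulled the model it trained on. Staler updates count for less.
    """
    weight = alpha / (1 + staleness)
    return [(1 - weight) * g + weight * l
            for g, l in zip(global_model, local_model)]

g = [1.0, 1.0]
g = async_update(g, [3.0, 5.0], staleness=0)  # fresh update, weight 0.5
# g is now [2.0, 3.0]
g = async_update(g, [0.0, 0.0], staleness=3)  # stale update, weight 0.125
```

Because each device's update is folded in as soon as it arrives, no one waits for stragglers; the staleness discount is what keeps very old updates from dragging the current global model backward.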
“…Some other notable studies have concentrated on improving the effectiveness of the training process itself. Examples include modifying the fundamental framework of FL, such as a novel asynchronous FL framework that adapts well to heterogeneous communication environments [19]; an efficient-communication FL approach that provides users with a training strategy for fast convergence [20]; introducing binary neural networks in place of conventional real-valued neural networks to meet the strict latency and efficiency constraints at the wireless network edge [21]; and investigating FL over a multihop wireless network with in-network model aggregation [22].…”
Section: Related Work (mentioning, confidence: 99%)
“…Practically, this would also help to avoid communication overhead due to frequent reallocation of the frequency spectrum. We will consider P3: Minimize …, subject to (10), (11), (12), (13), (14), (19).…”
Section: A. Preliminaries (mentioning, confidence: 99%)
“…Meanwhile, when one device uploads its model, the others continue to complete their training. The authors in [22] proposed a novel asynchronous FL mechanism to coordinate the heterogeneity of devices, communication environments, and learning tasks. Nevertheless, the number of training rounds under asynchronous methods is higher than under synchronous methods.…”
Section: Introduction (mentioning, confidence: 99%)