2022
DOI: 10.1109/twc.2022.3166386

Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks


Cited by 64 publications (17 citation statements); References 24 publications.
“…For instance, one can explore the effects of adopting multiple antennas at the AP, which can be used to manoeuvre power boosting and/or interference cancellation [25], on the performance of SFWFL. Investigating the impacts of UEs' mobility on the system's performance is also a concrete direction [26]. Another future extension of the present study is to reduce the sensitivity of SFWFL to heavy-tailed interference via, e.g., the gradient clipping schemes [27].…”
Section: Discussion
confidence: 99%
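As a hedged illustration of the gradient-clipping idea the excerpt points to, the sketch below clips each local gradient to a norm bound before averaging, which limits the influence of heavy-tailed interference on the aggregate. The threshold `clip_norm` and the plain-averaging aggregation rule are assumptions for illustration, not the specific scheme in [27].

```python
import numpy as np

def clip_gradient(grad, clip_norm=1.0):
    """Scale grad so its L2 norm is at most clip_norm (assumed threshold)."""
    norm = np.linalg.norm(grad)
    return grad * min(1.0, clip_norm / (norm + 1e-12))

def robust_aggregate(local_grads, clip_norm=1.0):
    """Average clipped local gradients: one simple way to blunt heavy tails."""
    return np.mean([clip_gradient(g, clip_norm) for g in local_grads], axis=0)
```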
“…The model training procedure in Algorithm 2 stipulates the following relationship:
$$\Delta_{k+1} = \Delta_k - \eta_k \sum_{n=1}^{N} \bar{g}_k^{n} + \eta_{k-D}\Big(\frac{1}{N}\sum_{n=1}^{N} \bar{g}_{k-D}^{n} - g_{k-D}\Big). \tag{34}$$
Taking similar steps as (26), we arrive at the following:
$$\mathbb{E}\big[\|\Delta_{k+1}\|_{\alpha}^{\alpha}\big] \le \mathbb{E}\big\|\Delta_k - \eta_k M \nabla f(\bar{w}_k)\big\| \ldots$$
…”
confidence: 95%
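To make the reconstructed update (34) concrete, here is a minimal NumPy sketch that applies it literally. The array shapes, the two learning-rate arguments, and the reading of g_{k-D} as the stale aggregate actually applied D rounds earlier are our assumptions, not the citing paper's code.

```python
import numpy as np

def delayed_gradient_step(delta_k, g_bar_k, g_bar_kD, g_kD, eta_k, eta_kD):
    """One application of Eq. (34), as reconstructed above.

    delta_k  : current residual Delta_k, shape (d,)
    g_bar_k  : per-device gradients at round k, shape (N, d)
    g_bar_kD : per-device gradients at the delayed round k-D, shape (N, d)
    g_kD     : stale aggregated gradient applied at round k-D, shape (d,)
    """
    # Descent term driven by the fresh device gradients.
    fresh = eta_k * g_bar_k.sum(axis=0)
    # Correction: gap between the true average gradient at round k-D
    # and the stale aggregate that was actually used back then.
    correction = eta_kD * (g_bar_kD.mean(axis=0) - g_kD)
    return delta_k - fresh + correction
```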
“…To train the AI models, we incorporate the federated learning (FL) algorithm [1], [2], as shown in Fig. 3. Furthermore, multiple BSs can jointly train the model in a wider range using the hierarchical FL algorithm [11], where a central parameter server aggregates the models from the BSs periodically.…”
Section: B. Timely Edge Training With OPVs
confidence: 99%
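The excerpt describes a two-level aggregation: each BS averages its devices' models, and a central parameter server periodically averages the BS-level models. The sketch below shows that structure under stated assumptions; data-size weighting and the single-round interface are illustrative, not the exact rule in [11].

```python
import numpy as np

def weighted_average(models, sizes):
    """Data-size-weighted average of a list of model vectors."""
    w = np.asarray(sizes, dtype=float)
    return np.average(np.stack(models), axis=0, weights=w / w.sum())

def hierarchical_round(per_bs_device_models, per_bs_device_sizes):
    """One cloud aggregation: edge-average per BS, then average the BS models."""
    bs_models, bs_sizes = [], []
    for models, sizes in zip(per_bs_device_models, per_bs_device_sizes):
        bs_models.append(weighted_average(models, sizes))  # edge aggregation
        bs_sizes.append(sum(sizes))
    return weighted_average(bs_models, bs_sizes)           # cloud aggregation
```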
“…A case study is shown in Section VI. For hierarchical FL, the central aggregation rule also needs to be designed based on the distribution and mobility of OPVs [11].…”
Section: Scheduling Policies For Timely Edge Training
confidence: 99%
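As a hedged sketch of what a mobility-aware central aggregation rule could look like, the weights below favor BSs whose vehicles (OPVs) stayed in the cell for the whole edge round, down-weighting cells whose local models were trained by vehicles that departed mid-round. This count-times-retention heuristic is our illustrative assumption, not the rule designed in [11].

```python
import numpy as np

def mobility_aware_weights(opvs_at_start, opvs_remaining):
    """Aggregation weights per BS from OPV counts at round start and end."""
    start = np.asarray(opvs_at_start, dtype=float)
    stay = np.asarray(opvs_remaining, dtype=float)
    w = stay * (stay / np.maximum(start, 1.0))  # count x retention ratio
    return w / w.sum()
```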
“…How to preserve the positive gains while avoiding undesired degradation during scaling to hierarchical architectures remains an active research topic. While previous works have studied how to improve FL convergence under one or two of data heterogeneity [31,39,50], system heterogeneity [13,34,38], unexpected stragglers [44], and hierarchical FL for better scalability [21,62], none of the existing works provides a systematic solution to address all challenges in a hierarchical and unreliable IoT network. Our work is the first end-to-end framework that uses (i) an asynchronous and hierarchical FL algorithm and (ii) a system management design to enhance efficiency and robustness, handling all challenges (C1)-(C4).…”
Section: Introduction
confidence: 99%