ICC 2021 - IEEE International Conference on Communications 2021
DOI: 10.1109/icc42927.2021.9500890
Fast-Convergent Federated Learning with Adaptive Weighting

Cited by 18 publications (25 citation statements)
References 8 publications
“…Given that different nodes contribute to the global model to varying degrees, one can improve the convergence rate by preferentially selecting the nodes with higher contribution (i.e., the nodes with i.i.d. datasets, as observed in [6], [18]). Accordingly, we propose a probabilistic node selection design: an unbiased node selection scheme that dynamically adjusts each node's probability of being selected in each communication round based on its data distribution-related contribution, which can be distinguished by the procedure CHECK EXPECTATION in Optimal Aggregation.…”
Section: FL with Probabilistic Node Selection (FedPNS)
confidence: 65%
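The excerpt above does not spell out the probability-update rule, so the following is only a minimal sketch of how such a dynamic, contribution-driven selection loop could look. The function names (`update_selection_probs`, `sample_nodes`), the exponential-smoothing update, and the smoothing rate `lr` are all illustrative assumptions, not the FedPNS algorithm itself.

```python
import numpy as np

def update_selection_probs(probs, contributions, lr=0.1):
    """Nudge each node's selection probability toward its normalized
    contribution score (hypothetical smoothing rule, not from the paper)."""
    contributions = np.asarray(contributions, dtype=float)
    target = contributions / contributions.sum()   # contribution-proportional target
    probs = (1 - lr) * np.asarray(probs, dtype=float) + lr * target
    return probs / probs.sum()                     # renormalize to a distribution

def sample_nodes(probs, k, rng):
    """Draw k distinct nodes for this communication round
    according to the current selection probabilities."""
    return rng.choice(len(probs), size=k, replace=False, p=probs)

# Example: node 3 reports a higher contribution, so its probability rises.
probs = update_selection_probs([0.25] * 4, [1.0, 1.0, 1.0, 5.0])
chosen = sample_nodes(probs, 2, np.random.default_rng(0))
```

Because every node keeps a strictly positive probability, each round's selection remains unbiased in the sense that any node can still participate; only the sampling weights shift toward higher-contribution nodes.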
“…data distribution across participating nodes, aiming to reduce communication rounds in FL. These studies include adaptively tuning local training [9], weighting designs for model aggregation [18], and node selection strategies [19]–[22]. The FedProx algorithm by Li et al. [9] uses a regularization term to balance optimizing the discrepancy between global and local objectives against allowing participating nodes to perform a variable number of local updates, thereby mitigating the non-i.i.d.…”
Section: Related Work
confidence: 99%
“…Meanwhile, the influence of malicious models should also be reduced. In this paper, we propose using the weighted federated averaging method [16], a modified version of FedAvg. The modified FedAvg mechanism works together with the reputation system to keep the federated ML secure and stable.…”
Section: Weighted Federated Averaging
confidence: 99%
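Weighted federated averaging replaces FedAvg's uniform (or dataset-size-proportional) weights with per-node weights, here supplied by a reputation system so that suspected malicious nodes contribute less to the aggregate. A minimal sketch, assuming the weights are nonnegative reputation scores (the function name and the use of raw scores as weights are illustrative assumptions):

```python
import numpy as np

def weighted_fedavg(models, weights):
    """Aggregate local model parameter vectors with per-node weights
    (e.g., reputation scores). Plain FedAvg is the special case of
    equal weights; a score of 0 excludes a node entirely."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()              # normalize to sum to 1
    stacked = np.stack([np.asarray(m, dtype=float) for m in models])
    return weights @ stacked                       # convex combination of models

# Two nodes: equal reputation averages them; zero reputation drops node 2.
avg = weighted_fedavg([[1.0, 1.0], [3.0, 3.0]], [1.0, 1.0])   # -> [2.0, 2.0]
only_first = weighted_fedavg([[1.0, 1.0], [3.0, 3.0]], [1.0, 0.0])
```

Because the normalized weights form a convex combination, a node whose reputation decays toward zero is smoothly phased out of the global model rather than abruptly banned.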