2020
DOI: 10.48550/arxiv.2007.15197
Preprint

Communication-Efficient Federated Learning via Optimal Client Sampling

Abstract: Federated learning is a private and efficient framework for learning models in settings where data is distributed across many clients. Due to the interactive nature of the training process, frequent communication of large amounts of information is required between the clients and the central server that aggregates local models. We propose a novel, simple, and efficient way of updating the central model in communication-constrained settings by determining the optimal client sampling policy. In particular, modeling …
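As a rough sketch of the communication-constrained update described in the abstract, the snippet below keeps only a fixed-size subset of client updates per round, chosen by update norm, and aggregates just those. The top-k-by-norm rule, the `budget` parameter, and the helper names are illustrative assumptions; the paper derives an optimal sampling policy, which this sketch does not reproduce.

```python
import numpy as np

def select_clients_by_update_norm(updates, budget):
    """Pick the `budget` clients whose local updates have the largest L2 norm.

    `updates` maps client id -> flattened model update (np.ndarray).
    Top-k-by-norm is an illustrative stand-in for an "informative update"
    criterion, not the optimal sampling policy derived in the paper.
    """
    norms = {cid: np.linalg.norm(u) for cid, u in updates.items()}
    return sorted(norms, key=norms.get, reverse=True)[:budget]

def aggregate_selected(global_model, updates, chosen):
    """Apply the simple average of the transmitted updates to the global model."""
    delta = np.mean([updates[cid] for cid in chosen], axis=0)
    return global_model + delta

# Hypothetical usage: three clients, a budget of two uploads per round.
rng = np.random.default_rng(0)
updates = {c: rng.normal(size=10) for c in ("a", "b", "c")}
chosen = select_clients_by_update_norm(updates, budget=2)
new_model = aggregate_selected(np.zeros(10), updates, chosen)
```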

Cited by 18 publications (29 citation statements) | References 44 publications
“…datasets as indicated in their results. Some other empirical studies with similar approaches include [15], [16], but these also do not consider or derive convergence bounds for their selection strategies. Later work began to include convergence analysis of FL with device selection.…”
Section: Related Work (mentioning)
confidence: 99%
“…Since the objective is an independent sum over n, we can perform the minimization separately for each device n. Algorithm 2 details the process of determining the optimal P_n(t) and q_n^t in each round. We now present Theorem 2, which gives an analytical solution to (15) that can be computed distributively by the devices.…”
Section: Algorithm 2: Stochastic Client Sampling (mentioning)
confidence: 99%
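The separability noted in the excerpt above, an objective that is an independent sum over devices n, is what lets each device solve its own small subproblem. The sketch below illustrates that pattern with a hypothetical scalar objective per device; the actual objective (15), the quantities P_n(t) and q_n^t, and the closed-form solution of Theorem 2 in the citing paper are not reproduced here.

```python
from scipy.optimize import minimize_scalar

def solve_device_subproblem(local_stat, comm_penalty):
    """One device minimizes its own term of a separable objective over a
    sampling probability p in (0, 1].

    The objective below (a variance-style term that grows as p shrinks,
    plus a communication cost that grows with p) is a hypothetical
    placeholder, not formulation (15) from the citing paper.
    """
    def per_device_objective(p):
        return local_stat / p + comm_penalty * p

    res = minimize_scalar(per_device_objective, bounds=(1e-6, 1.0), method="bounded")
    return res.x

# Because the sum decouples across devices, each device can run this
# locally ("distributively") with no coordination beyond shared constants.
probs = [solve_device_subproblem(s, comm_penalty=2.0) for s in (0.5, 1.3, 4.0)]
```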
“…They propose a federated averaging scheme in which the aggregation is weighted by the probabilities of devices being inactive at a given communication round. Another selection approach is proposed in [9], in which clients with the most significant local updates are selected.…”
Section: Introduction (mentioning)
confidence: 99%
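For the first scheme mentioned above, weighting the aggregation by how likely each device is to be inactive, a common construction is to reweight each received update by the inverse of the device's participation probability so that rarely active clients are not underrepresented. The sketch below uses that inverse-probability weighting as an illustrative assumption; it is not claimed to match the cited scheme's exact weights.

```python
import numpy as np

def fedavg_inverse_participation(updates, participation_prob):
    """Average the updates received this round, scaling each by
    1 / P(device is active) as an unbiasedness-style correction.

    `updates`: client id -> flattened update (np.ndarray), active clients only.
    `participation_prob`: client id -> probability of being active in a round.
    The inverse-probability weights are an assumption for illustration.
    """
    weighted = [u / participation_prob[cid] for cid, u in updates.items()]
    return np.mean(weighted, axis=0)

# Hypothetical round: only clients "a" and "c" reported back.
probs = {"a": 0.9, "b": 0.5, "c": 0.2}
updates = {"a": np.ones(4), "c": 2.0 * np.ones(4)}
avg_update = fedavg_inverse_participation(updates, probs)
```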