2020
DOI: 10.48550/arxiv.2001.08737
Preprint

Communication Efficient Federated Learning over Multiple Access Channels

Wei-Ting Chang,
Ravi Tandon

Abstract: In this work, we study the problem of federated learning (FL), where distributed users aim to jointly train a machine learning model with the help of a parameter server (PS). In each iteration of FL, users compute local gradients, followed by transmission of the quantized gradients for subsequent aggregation and model updates at PS. One of the challenges of FL is that of communication overhead due to FL's iterative nature and large model sizes. One recent direction to alleviate communication bottleneck in FL i…

Cited by 13 publications (19 citation statements)
References 18 publications
“…, G. Since clients in cross-silo FL are companies or organizations, the data transmission between clients and the central server can be through either the wired networks (e.g., through the high-speed wired connections [41]) or the wireless networks (e.g., using transmission protocols TDMA or OFDMA [37]). We assume that each client has the same communication resource [42], and the parameters of local models have the same size. In this case, clients experience the same communication cost (e.g., transmission delay), which depends on the number of iterations between clients and the central server [40].…”
Section: Communication Cost
confidence: 99%
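The cost model in the excerpt above (equal communication resources, equal model sizes, total cost driven by the number of server-client iterations) can be sketched in a few lines. The function names and the example parameter values below are illustrative choices, not taken from the cited works:

```python
def per_round_bits(num_params: int, bits_per_param: int) -> int:
    """Uplink payload per client per round (each client sends a same-size model)."""
    return num_params * bits_per_param

def total_comm_cost(num_rounds: int, num_params: int, bits_per_param: int,
                    rate_bps: float) -> float:
    """Total transmission delay (seconds) for one client over all rounds.

    Under the excerpt's assumptions (equal resources, equal model size),
    cost scales linearly with the number of server-client iterations.
    """
    return num_rounds * per_round_bits(num_params, bits_per_param) / rate_bps

# Example: 1M-parameter model, 32-bit floats, 100 Mb/s link, 200 rounds
delay = total_comm_cost(200, 1_000_000, 32, 100e6)  # 64.0 seconds
```

The linear dependence on `num_rounds` is why both quantization (shrinking `bits_per_param`) and round-reduction techniques attack the same bottleneck.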
“…In (11), moments of S are not easy to calculate as S is equal to an order statistic of a sum of exponential and shifted exponential random variables. To simplify, we assume that downlink transmissions are instantaneous since, in general, connection speeds are significantly asymmetric such that downlink transmissions are much faster than uplink transmissions [23].…”
Section: Average Age Analysis
confidence: 99%
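The quantity the excerpt describes — an order statistic of a sum of exponential and shifted exponential random variables — is awkward in closed form but straightforward to estimate by Monte Carlo. The sketch below assumes n clients whose per-round times are Exp(λ) computation plus (shift + Exp(μ)) uplink, with the downlink treated as instantaneous per the excerpt; all parameter values are illustrative:

```python
import random

def kth_order_stat_mean(n, k, lam, mu, shift, trials=100_000, seed=0):
    """Monte Carlo estimate of E[S_(k)], the k-th smallest of n i.i.d. sums
    S_i = X_i + Y_i, with X_i ~ Exp(lam) (computation time) and
    Y_i ~ shift + Exp(mu) (uplink time); downlink is taken as instantaneous.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sums = sorted(rng.expovariate(lam) + shift + rng.expovariate(mu)
                      for _ in range(n))
        total += sums[k - 1]          # k-th order statistic (1-indexed)
    return total / trials

# e.g. mean time until the fastest 5 of 10 clients finish a round
est = kth_order_stat_mean(n=10, k=5, lam=1.0, mu=2.0, shift=0.1)
```

Simulating the order statistic directly sidesteps the convolution of an exponential with a shifted exponential that a closed-form analysis would require.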
“…Some promising applications of FL are image classification and nextword prediction [1], human mobility prediction [2], news recommenders and interactive social networks [3], healthcare applications [4], and so on. Recent works in [5]- [11] study communication-efficient FL frameworks suitable for the limited communication between the PS and the clients considering varying channel conditions, quantization and sparsification, non-i.i.d. client datasets, and coding.…”
Section: Introduction
confidence: 99%
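Of the techniques the excerpt lists, quantization is the simplest to illustrate. Below is a minimal sketch of unbiased stochastic quantization of a gradient vector onto a small evenly spaced grid; the level count, helper name, and randomized-rounding scheme are generic choices for illustration, not the specific schemes of [5]-[11]:

```python
import random

def stochastic_quantize(grad, num_levels=4, rng=random.Random(0)):
    """Quantize each coordinate onto `num_levels` evenly spaced levels in
    [-m, m], m = max |grad_i|, rounding up or down at random so that the
    quantized vector is unbiased: E[q_i] = grad_i.
    """
    m = max(abs(g) for g in grad)
    if m == 0:
        return list(grad)
    step = 2 * m / (num_levels - 1)
    out = []
    for g in grad:
        pos = (g + m) / step              # position in level units
        lo = int(pos)                     # lower neighboring level
        p_up = pos - lo                   # round up with this probability
        level = min(lo + (1 if rng.random() < p_up else 0), num_levels - 1)
        out.append(level * step - m)
    return out

g = [0.3, -1.2, 0.7, 0.0]
q = stochastic_quantize(g)   # each entry lands on the 4-level grid over [-1.2, 1.2]
```

Each coordinate then needs only log2(num_levels) bits plus the shared scale m, which is the source of the communication savings.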
“…Lazy updates are introduced in (Chen et al, 2018a,b), and combinations of non-periodic updates and quantization is explored in (Reisizadeh et al, 2020). Furthermore, when distributed learning is taking place over a wireless network, there is an interest in allocating the available network resources efficiently among the users holding the data (Gündüz et al, 2019), such as power (Chen et al, 2019) or rates (Chang and Tandon, 2020). The problem of scheduling gradient updates over multiple access channels has also received initial consideration, for example comparing time-based approaches with approaches based on channel conditions, or including gradient information (Amiri et al, 2020;Chen et al, 2020).…”
Section: Introduction
confidence: 99%
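The contrast the excerpt draws — time-based scheduling versus scheduling on channel conditions — can be sketched with two toy policies. The round-robin and best-channel selectors below are generic illustrations of the two families, not the specific schemes of the cited papers:

```python
def round_robin(num_users: int, k: int, t: int) -> list:
    """Time-based policy: schedule k users per round in a fixed rotation."""
    return [(t * k + i) % num_users for i in range(k)]

def best_channel(channel_gains: list, k: int) -> list:
    """Channel-based policy: schedule the k users with the strongest channels."""
    ranked = sorted(range(len(channel_gains)),
                    key=lambda u: channel_gains[u], reverse=True)
    return ranked[:k]

gains = [0.2, 1.5, 0.9, 0.1, 2.3, 0.7]
rr = round_robin(6, 2, 0)        # round 0 of 6 users, 2 slots -> [0, 1]
bc = best_channel(gains, 2)      # strongest two channels -> [4, 1]
```

Gradient-aware schemes like those in (Amiri et al, 2020) would replace the channel gains with a score that also reflects each user's update magnitude.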