2021
DOI: 10.3390/electronics10172081

Communication Cost Reduction with Partial Structure in Federated Learning

Abstract: Federated learning is a distributed learning algorithm designed to train a single server model using many clients and their local data. Improving the server model requires repeated communication with the clients, and because the number of clients is very large, the algorithm must be designed with the communication cost in mind. In this paper, we propose a method for distributing a model with a structure different from that of the server model, distributi…
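
As a rough orientation to the setting the (truncated) abstract describes, and not the paper's proposed partial-structure method, the sketch below shows one generic federated-averaging round in plain Python. All names (local_update, fedavg_round, the toy model and data) are illustrative assumptions.

```python
# Minimal sketch of one federated-averaging round (generic FedAvg,
# NOT the partial-structure method proposed in the paper).
# All names and the toy "training" rule are illustrative assumptions.

import random

def local_update(weights, data, lr=0.1):
    """Hypothetical client step: nudge each weight toward the mean of the client's data."""
    mean = sum(data) / len(data)
    return [w - lr * (w - mean) for w in weights]

def fedavg_round(server_weights, client_datasets):
    """Each client trains on its own data; the server averages the returned weights."""
    client_weights = [local_update(server_weights, d) for d in client_datasets]
    n = len(client_weights)
    return [sum(ws[i] for ws in client_weights) / n
            for i in range(len(server_weights))]

if __name__ == "__main__":
    random.seed(0)
    server = [0.0, 0.0, 0.0]                                   # toy "model": 3 weights
    clients = [[random.random() for _ in range(5)] for _ in range(4)]
    for _ in range(10):                                        # 10 communication rounds
        server = fedavg_round(server, clients)
    print(server)
```
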

Cited by 13 publications (4 citation statements) · References 24 publications
“…We note that, in many cases, data sharing is not free from security, regulatory and privacy issues [8]. We also note that the communication cost for centralized learning depends on the number and size of the collected data [22,23]. In contrast, the communication cost for federated learning is independent of the data size but depends on the CNN architecture (specifically, the number of weights in the CNN).…”
Section: Federated Learning For Handwriting Character Recognition
Citation type: mentioning; confidence: 99%
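
To make the cost trade-off in the statement above concrete, here is a back-of-the-envelope sketch with made-up numbers (not figures from the cited works): centralized learning ships raw samples once, so its cost grows with the dataset, while federated learning ships model weights every round, so its cost grows with the architecture and the number of rounds.

```python
# Rough cost comparison with hypothetical numbers (not taken from the cited papers).
num_clients        = 100
samples_per_client = 100_000
bytes_per_sample   = 3 * 224 * 224      # e.g. one 224x224 RGB image
num_weights        = 1_000_000          # size of a hypothetical CNN
bytes_per_weight   = 4                  # float32
rounds             = 50                 # one upload + one download per round

centralized_bytes = num_clients * samples_per_client * bytes_per_sample
federated_bytes   = num_clients * rounds * 2 * num_weights * bytes_per_weight

print(f"centralized: {centralized_bytes / 1e9:,.0f} GB (scales with dataset size)")
print(f"federated:   {federated_bytes / 1e9:,.0f} GB (scales with model size and rounds)")
```
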
“…Finally, as for the last method, each client can train and upload a subset of the model (i.e., the shallow and deep layers can be split) to save related costs. Kang et al [21], for example, distributed a model according to data sizes of clients, and Wang et al [22] divided the global model into branches according to sample categories.…”
Section: Communication Cost Reduction
Citation type: mentioning; confidence: 99%
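
A minimal sketch of the "upload a subset of the model" idea mentioned above: clients send only a chosen set of layers, and the server averages those while keeping the rest unchanged. The layer names and the split rule are assumptions for illustration, not the actual schemes of Kang et al. [21] or Wang et al. [22].

```python
# Illustrative partial-upload aggregation; names and split rule are assumptions.

def split_upload(client_model, shared_layers):
    """Client sends only the layers the server asked for."""
    return {name: w for name, w in client_model.items() if name in shared_layers}

def aggregate_partial(server_model, uploads):
    """Average whatever layers were uploaded; keep the remaining layers unchanged."""
    new_model = dict(server_model)
    for name in uploads[0]:
        vals = [u[name] for u in uploads]
        new_model[name] = [sum(col) / len(vals) for col in zip(*vals)]
    return new_model

server = {"conv1": [0.1, 0.2], "conv2": [0.3, 0.4], "fc": [0.5, 0.6]}
clients = [
    {"conv1": [0.0, 0.1], "conv2": [0.2, 0.3], "fc": [0.9, 0.8]},
    {"conv1": [0.2, 0.3], "conv2": [0.4, 0.5], "fc": [0.1, 0.2]},
]
shared  = {"conv1", "conv2"}            # only the shallow layers travel over the network
uploads = [split_upload(c, shared) for c in clients]
print(aggregate_partial(server, uploads))
```
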
“…Caldas et al. [12]: Dropout compression
Wang et al. [13]: SVD compression
Sattler et al. [14]: Sparse compression
Asad et al. [15]: Sparse compression
Lu et al. [16]: Threshold compression
Zhu et al. [17]: Frequency reduction
Huang et al. [18]: Frequency reduction
Wang et al. [19]: Frequency reduction
Wang et al. [20]: Frequency reduction
Kang et al. [21]: Training split
Wang et al. [22]: Training split…”
Section: Aggregation Based On Informative Attributes
Citation type: mentioning; confidence: 99%
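
As one concrete instance of the compression techniques listed above, the sketch below shows a generic top-k, threshold-style sparsification of a weight update; the exact rule is an assumption, not the specific method of refs. [14], [15], or [16].

```python
# Generic sparse/threshold compression of a weight update (illustrative only).

def sparsify(update, keep_ratio=0.1):
    """Keep only the largest-magnitude entries; send them as (index, value) pairs."""
    k = max(1, int(len(update) * keep_ratio))
    idx = sorted(range(len(update)), key=lambda i: abs(update[i]), reverse=True)[:k]
    return [(i, update[i]) for i in sorted(idx)]

def densify(pairs, length):
    """Server side: rebuild the full update, with zeros for the dropped entries."""
    full = [0.0] * length
    for i, v in pairs:
        full[i] = v
    return full

update     = [0.01, -0.5, 0.03, 0.7, -0.02, 0.0, 0.4, -0.05]
compressed = sparsify(update, keep_ratio=0.25)
print(compressed)                         # [(1, -0.5), (3, 0.7)]
print(densify(compressed, len(update)))
```
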
“…Continuous communication in federated learning requires an extensive number of communication rounds between the client and server due to the sizeable data and number of clients. This necessitates the use of effective algorithms and models to reduce the cost between the client and the global model [120]. To that end, a number of approaches have been used to decrease communication rounds and improve performance.…”
Section: Communication Cost
Citation type: mentioning; confidence: 99%