2022
DOI: 10.1109/jiot.2021.3101991
Differential Privacy Meets Federated Learning Under Communication Constraints

Abstract: The performance of federated learning systems is bottlenecked by communication costs and training variance. The communication overhead problem is usually addressed by three communication-reduction techniques, namely, model compression, partial device participation, and periodic aggregation, at the cost of increased training variance. Different from traditional distributed learning systems, federated learning suffers from data heterogeneity (since the devices sample their data from possibly different distributi…
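The three communication-reduction techniques named in the abstract can be illustrated with a minimal federated-averaging sketch. This is a hypothetical toy example, not the paper's method: the helper names (`top_k_sparsify`, `local_sgd`) are made up, top-k sparsification stands in for model compression, and the clients solve small least-squares problems with mildly heterogeneous data.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k_sparsify(update, k):
    """Model compression: keep only the k largest-magnitude entries."""
    out = np.zeros_like(update)
    idx = np.argsort(np.abs(update))[-k:]
    out[idx] = update[idx]
    return out

def local_sgd(w, data, lr=0.1, epochs=5):
    """Periodic aggregation: run several local steps before communicating."""
    x, y = data
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)   # least-squares gradient
        w = w - lr * grad
    return w

# Ten clients, each with slightly different data (data heterogeneity).
dim, n_clients = 5, 10
w_base = rng.normal(size=dim)
clients = []
for _ in range(n_clients):
    x = rng.normal(size=(20, dim))
    w_true = w_base + 0.1 * rng.normal(size=dim)  # per-client perturbation
    clients.append((x, x @ w_true + 0.01 * rng.normal(size=20)))

w_global = np.zeros(dim)
for rnd in range(30):
    # Partial device participation: sample 3 of 10 clients per round.
    chosen = rng.choice(n_clients, size=3, replace=False)
    updates = [
        top_k_sparsify(local_sgd(w_global.copy(), clients[c]) - w_global, k=3)
        for c in chosen
    ]
    w_global = w_global + np.mean(updates, axis=0)
```

Each of the three techniques trims communication (fewer participants per round, fewer coordinates per update, fewer rounds per local step) while injecting extra variance into the aggregated update, which is exactly the tradeoff the abstract describes.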

Cited by 26 publications (5 citation statements) | References 18 publications
“…Although the assurance of local user data privacy is a standout feature of FL, malicious actors may still be able to glean critical system information from model changes [10]. Although newer methods such as secure multiparty computation (SMC) [11], differential privacy [12] or secure aggregation [5] seek to improve the privacy of FL, these approaches generally sacrifice inference performance for privacy. Understanding and balancing these costs is a significant difficulty in implementing private FL systems, both theoretically and practically [13].…”
Section: B. Privacy Issues in FL
confidence: 99%
“…In addition to that, most of the privacy-preserving distributed learning methods mentioned above consider periodical local model updates from the local devices and then send the aggregated model back to all devices. This results in substantial communication overhead [25]. While considering both aspects, we propose a high-performing ensemble distributed learning system with knowledge transfer based on differential privacy, which does not require a continuous local model update.…”
Section: Related Work
confidence: 99%
“…Concretely, non-iid data requires more communication rounds to achieve the desired performance but more rounds mean more noise addition which degrades the utility. To our best knowledge, only a few papers [12]- [14] simultaneously investigate the non-iid and DP problem. These works either utilize the full client sampling or MD client sampling that ignores the impact of the heterogeneous client due to non-iid data.…”
Section: Introduction
confidence: 99%
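The tension quoted above (non-iid data demands more communication rounds, but every round spends privacy budget, so per-round noise must grow) can be made concrete with the standard Gaussian-mechanism noise formula under basic composition. The parameter values below are illustrative assumptions, not figures from the paper.

```python
import math

def gaussian_sigma(eps, delta, sensitivity=1.0):
    """Noise scale for one (eps, delta)-DP release via the Gaussian mechanism:
    sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / eps."""
    return sensitivity * math.sqrt(2 * math.log(1.25 / delta)) / eps

# Fix a total privacy budget and split it evenly across communication rounds
# (basic composition). As the round count grows, the per-round epsilon shrinks,
# so the per-round noise scale grows -- the utility cost of needing more
# rounds under non-iid data.
total_eps, total_delta = 1.0, 1e-5
for rounds in (10, 100, 1000):
    sigma = gaussian_sigma(total_eps / rounds, total_delta / rounds)
    print(f"{rounds:5d} rounds -> per-round noise sigma = {sigma:.1f}")
```

Tighter accountants (e.g., advanced composition or moments accounting) soften this growth but do not remove it, which is why client-sampling strategies that ignore heterogeneity leave utility on the table.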