2022
DOI: 10.48550/arxiv.2206.01906
Preprint
Hybrid Architectures for Distributed Machine Learning in Heterogeneous Wireless Networks

Abstract: The ever-growing data privacy concerns have transformed machine learning (ML) architectures from centralized to distributed, leading to federated learning (FL) and split learning (SL) as the two most popular privacy-preserving ML paradigms. However, implementing either conventional FL or SL alone with diverse network conditions (e.g., device-to-device (D2D) and cellular communications) and heterogeneous clients (e.g., heterogeneous computation/communication/energy capabilities) may face significant challenges,…
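For context on the FL paradigm named in the abstract: federated learning typically follows the federated averaging (FedAvg) pattern, in which clients train locally on private data and a server aggregates their model updates. The sketch below is a minimal illustration of that generic pattern only; it is not the hybrid FL/SL architecture this paper proposes, and all function names and the toy model are illustrative assumptions.

```python
# Minimal illustrative sketch of federated averaging (FedAvg), the canonical
# FL aggregation rule. Not the paper's hybrid architecture; toy values only.

def local_update(weights, gradient, lr=0.1):
    """One client step of gradient descent on its private local data."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def fedavg(client_weights, client_sizes):
    """Server aggregates client models, weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(cw[i] * n for cw, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Two heterogeneous clients (different data volumes) update a shared
# two-parameter model; only model updates, not raw data, reach the server.
global_model = [1.0, -2.0]
updates = [local_update(global_model, g) for g in ([0.5, 0.5], [-0.5, 0.5])]
new_global = fedavg(updates, client_sizes=[30, 10])
```

The size-weighted average is what lets clients with more data pull the global model harder, which matters precisely in the heterogeneous-client setting the abstract describes.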

Cited by 1 publication (1 citation statement). References 13 publications (23 reference statements).
“…is outpacing the increase in computation power of existing centralized (cloud) infrastructures. Moreover, supporting frequent transmission of huge amount of training data towards the cloud is a challenging task even for wired links [8].…”
Section: Introduction
confidence: 99%