2022
DOI: 10.3390/s22165983
Combined Federated and Split Learning in Edge Computing for Ubiquitous Intelligence in Internet of Things: State-of-the-Art and Future Directions

Abstract: Federated learning (FL) and split learning (SL) are two emerging collaborative learning methods that may greatly facilitate ubiquitous intelligence in the Internet of Things (IoT). Federated learning enables machine learning (ML) models locally trained using private data to be aggregated into a global model. Split learning allows different portions of an ML model to be collaboratively trained on different workers in a learning framework. Federated learning and split learning, each have unique advantages and re…

Cited by 49 publications (22 citation statements)
References 103 publications
“…In an inference attack, the adversary exploits the gradients exchanged during the FL training process, which can seriously leak information about the features of clients' training data. Inference attacks include inferring class representatives [66], inferring membership [67], inferring data properties [68], and inferring samples/labels [69]. In the inference of class representatives, the adversary generates samples that are not in the original training dataset.…”
Section: Privacy Leakage and Threats in FL
confidence: 99%
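The sample/label inference mentioned above can be illustrated with a minimal sketch. Assuming (as in iDLG-style attacks, not stated in the excerpt) a softmax classifier trained with cross-entropy, the shared gradient of the final-layer bias equals p − y, so its only negative entry sits at the true label; all names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy final layer: logits = W @ h + b for a single private sample.
num_classes, hidden = 5, 8
W = rng.normal(size=(num_classes, hidden))
b = np.zeros(num_classes)
h = rng.normal(size=hidden)      # activation for the private sample
true_label = 3

# The client computes cross-entropy gradients and shares them, as in FL.
p = softmax(W @ h + b)
y = np.eye(num_classes)[true_label]
grad_b = p - y                   # gradient w.r.t. the final-layer bias

# An eavesdropper on grad_b recovers the label: the only negative entry.
inferred = int(np.argmin(grad_b))
print(inferred)  # 3
```

Because softmax probabilities are strictly positive, every non-true-label entry of grad_b is positive, which is why the single-sample label leaks exactly.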
“…In vanilla SL, each client's local model makes one or more training passes over its local dataset (alternate client training). This can cause a "catastrophic forgetting" issue (Gawali et al., 2020; Duan et al., 2022). To mitigate it, SplitFedv3 proposes alternating mini-batch training, in which a client updates its client-side model on one mini-batch and the client next in order then takes over (Gawali et al., 2020).…”
Section: A. Additional Related Work
confidence: 99%
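The two client orderings contrasted in that excerpt can be sketched as schedulers; the function names and client/batch structure below are hypothetical stand-ins for one forward/backward pass through the split model.

```python
def vanilla_sl_schedule(clients, epochs):
    """Alternate *client* training: each client finishes all of its
    mini-batches before the next client starts (vanilla SL)."""
    order = []
    for _ in range(epochs):
        for c in clients:
            for batch in c["batches"]:
                order.append((c["id"], batch))
    return order

def alternate_minibatch_schedule(clients, epochs):
    """Alternate *mini-batch* training (SplitFedv3-style): clients take
    turns after every single mini-batch, interleaving their updates."""
    order = []
    for _ in range(epochs):
        for step in range(max(len(c["batches"]) for c in clients)):
            for c in clients:
                if step < len(c["batches"]):
                    order.append((c["id"], c["batches"][step]))
    return order

clients = [{"id": "A", "batches": [0, 1]}, {"id": "B", "batches": [0, 1]}]
print(vanilla_sl_schedule(clients, 1))
# [('A', 0), ('A', 1), ('B', 0), ('B', 1)]
print(alternate_minibatch_schedule(clients, 1))
# [('A', 0), ('B', 0), ('A', 1), ('B', 1)]
```

Interleaving at mini-batch granularity means the shared server-side model never sees a long uninterrupted run of one client's data, which is the intuition behind the forgetting mitigation.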
“…FedSeq (Zaccone et al., 2022) is the same as SFLG except that training inside a group is identical to vanilla SL. More variants can be found in ; Duan et al. (2022).…”
Section: A. Additional Related Work
confidence: 99%
“…Moreover, offloading training tasks from FL participant devices aligns naturally with the emerging Split Learning (SL) paradigm [216]. Recently, encouraging progress has been made toward combining FL and SL in a single framework to improve learning performance [217], which opens another avenue for fully leveraging edge computing to support collaborative learning. The integration of FL and SL in an edge environment is therefore expected to be an interesting topic for future research.…”
Section: Releasing Computation Loads from FL Participant Devices
confidence: 99%
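The FL+SL combination referenced above can be sketched as one split-federated round. This is a minimal toy sketch, not the surveyed framework: each client trains only a client-side first layer via split learning against a shared server-side layer, and the client-side weights are then averaged FedAvg-style; all dimensions, the squared-error loss, and the learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

d_in, d_hid, lr = 4, 3, 0.05
clients = [rng.normal(size=(d_hid, d_in)) for _ in range(3)]  # client-side layers
server_W = rng.normal(size=(1, d_hid))                        # server-side layer

def split_step(Wc, x, y):
    """One split-learning step on one sample (squared-error loss)."""
    global server_W
    a = Wc @ x                            # client forward; activation sent up
    err = server_W @ a - y                # server forward + output error
    grad_a = server_W.T @ err             # cut-layer gradient sent back down
    server_W = server_W - lr * np.outer(err, a)   # server-side update
    return Wc - lr * np.outer(grad_a, x)          # client-side update

x, y = rng.normal(size=d_in), np.array([1.0])
for i, Wc in enumerate(clients):
    clients[i] = split_step(Wc, x, y)

# Federated aggregation of the client-side models (FedAvg).
global_Wc = sum(clients) / len(clients)
print(global_Wc.shape)  # (3, 4)
```

The split keeps heavy server-side computation off the devices, while the final averaging step is exactly the FL aggregation applied to the lightweight client-side portion.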