2021 · Preprint
DOI: 10.48550/arxiv.2110.13388

Semi-Supervised Federated Learning with non-IID Data: Algorithm and System Design

Abstract: Federated Learning (FL) allows edge devices (or clients) to keep data locally while simultaneously training a shared high-quality global model. However, current research is generally based on the assumption that the training data of local clients have ground-truth labels. Furthermore, FL faces the challenge of statistical heterogeneity, i.e., the distribution of each client's local training data is non-independent and identically distributed (non-IID). In this paper, we present a robust semi-supervised FL system design, wh…

Cited by 1 publication (4 citation statements) · References 18 publications (28 reference statements)
“…• RQ3: How does each module take effects in the framework to achieve personalized efficiency federated learning under the semi-supervised setting? SSFL [39], FedMix [38], FedSEAL [1], and SemiFL [4] as baselines.…”
Section: Methods
confidence: 99%
“…As there is limited work [14] on federated learning in natural language processing, we choose the identical baselines with image tasks. However, according to the intuition and typical design for image tasks, SSFL [39] and FedMix [38] are not fit for text tasks. SSFL is based on replacing batch normalization with group normalization.…”
Section: Performance Evaluation
confidence: 99%
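The citation statement above notes that SSFL replaces batch normalization with group normalization, which is a common remedy for non-IID federated data: group-norm statistics are computed per sample rather than per mini-batch, so they do not drift with each client's skewed batch distribution. A minimal pure-Python sketch of per-sample group normalization (the function name, eps default, and flat-vector layout are illustrative assumptions, not taken from the paper):

```python
import math

def group_norm(x, num_groups, eps=1e-5):
    """Normalize a single sample's channel vector x (length C) per group.

    The mean/variance depend only on this one sample, so the output is
    identical whether the sample arrives alone or in a mini-batch --
    the property that makes group normalization robust to non-IID
    client data, unlike batch normalization.
    """
    c = len(x)
    assert c % num_groups == 0, "channels must split evenly into groups"
    size = c // num_groups
    out = []
    for g in range(num_groups):
        group = x[g * size:(g + 1) * size]
        mean = sum(group) / size
        var = sum((v - mean) ** 2 for v in group) / size
        out.extend((v - mean) / math.sqrt(var + eps) for v in group)
    return out
```

For example, `group_norm([1.0, 3.0, 2.0, 4.0], num_groups=2)` normalizes the pairs `[1, 3]` and `[2, 4]` independently, each to roughly `[-1, 1]`; a batch-norm layer would instead mix statistics across whatever samples happen to share the client's batch.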