2020
DOI: 10.48550/arxiv.2012.03292
Preprint

FedSiam: Towards Adaptive Federated Semi-Supervised Learning

Abstract: Federated learning (FL) has emerged as an effective technique for collaboratively training machine learning models without sharing data or leaking privacy. However, most existing FL methods focus on the supervised setting and ignore unlabeled data. Although a few existing studies try to incorporate unlabeled data into FL, they all fail to maintain performance guarantees or generalization ability across settings. In this paper, we tackle the federated semi-supervised learning pro…


Cited by 11 publications (16 citation statements) | References 22 publications
“…Label deficiency in FL. There are a few related works that tackle label deficiency in FL [Long et al., 2020; Itahara et al., 2020; Jeong et al., 2020; Liang et al., 2021; Zhang et al., 2020b]. Compared to these works, our proposed SSFL does not use any labels during training.…”
Section: Related Work
confidence: 99%
“…These algorithms all depend on the strong assumption that the data at the edge has sufficient labels. To address the label deficiency issue in FL, recent works [Long et al., 2020; Itahara et al., 2020; Jeong et al., 2020; Liang et al., 2021; Zhang et al., 2020a] assume that the server or client has a fraction of labeled data and use semi-supervised methods such as consistency loss [Miyato et al., 2018] or pseudo labeling [Lee, 2013] to train a global model. A more realistic but challenging setting is fully unsupervised training.…”
Section: Introduction
confidence: 99%
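The pseudo-labeling technique [Lee, 2013] mentioned in this excerpt can be sketched as follows. This is a minimal illustration with a hypothetical `pseudo_label` helper and made-up softmax outputs, not code from any of the cited works: only predictions whose confidence clears a threshold are converted into hard labels for the unlabeled data.

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Keep only unlabeled examples whose max softmax probability
    clears the confidence threshold; return (kept indices, hard labels)."""
    confidence = probs.max(axis=1)
    keep = confidence >= threshold
    return np.where(keep)[0], probs[keep].argmax(axis=1)

# Toy softmax outputs for 4 unlabeled examples over 3 classes.
probs = np.array([
    [0.98, 0.01, 0.01],  # confident -> kept as class 0
    [0.40, 0.35, 0.25],  # uncertain -> discarded
    [0.05, 0.94, 0.01],  # just below threshold -> discarded
    [0.02, 0.02, 0.96],  # confident -> kept as class 2
])
idx, labels = pseudo_label(probs)
# idx -> [0, 3], labels -> [0, 2]
```

The retained (index, label) pairs are then treated as ordinary supervised examples; raising the threshold trades recall of pseudo-labels for their precision.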
“…Semi-supervised federated learning attempts to use semi-supervised learning techniques [19]-[23] to further improve the performance of the FL model in scenarios where there is unlabeled data on the client side [11]. For example, Long et al. in [10] proposed a semi-supervised federated learning (SSFL) system, FedSemi, which unifies the consistency-based semi-supervised learning model [24], dual model [15], and average teacher model [25] to achieve SSFL. The DS-FL system [26] was proposed to solve the communication overhead problem in SSFL.…”
Section: A Semi-supervised Federated Learning
confidence: 99%
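The "average teacher" (mean teacher) model cited above [25] maintains a teacher whose weights are an exponential moving average of the student's. A minimal sketch, with plain dicts of floats standing in for model tensors (the function name and decay value are illustrative, not from the paper):

```python
def ema_update(teacher, student, decay=0.99):
    """Move teacher weights toward the student via an exponential
    moving average (plain dicts of floats stand in for model tensors)."""
    return {k: decay * teacher[k] + (1 - decay) * student[k] for k in teacher}

teacher = {"w": 1.0}
student = {"w": 0.0}
teacher = ema_update(teacher, student, decay=0.9)
# teacher["w"] is now 0.9 * 1.0 + 0.1 * 0.0 == 0.9
```

The slowly-moving teacher then supplies targets for a consistency loss on unlabeled data, which is what makes its predictions more stable than the student's.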
“…Reference [27] proposes a method to study the distribution of non-IID data, which introduces a probability distance metric to evaluate the difference in client data distributions in SSFL. Different from the literature [9], [10], [26], in this paper we focus on the labels-at-server scenario and also solve the problems of data availability and data heterogeneity in SSFL.…”
Section: A Semi-supervised Federated Learning
confidence: 99%
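The excerpt does not specify which probability distance metric reference [27] uses. As one illustrative choice (an assumption, not the cited work's actual metric), the Jensen-Shannon divergence between two clients' label histograms quantifies how far apart their data distributions are:

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (natural log) between two label-count
    histograms, normalized to probability distributions."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

identical = js_divergence([1, 1, 2], [2, 2, 4])    # same distribution -> 0
disjoint = js_divergence([10, 0, 0], [0, 0, 10])   # no class overlap -> ~ln 2
```

JS divergence is symmetric and bounded (by ln 2 in natural log), which makes it convenient for comparing clients pairwise under non-IID label skew.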