2021
DOI: 10.48550/arxiv.2109.04533
Preprint

FedCon: A Contrastive Framework for Federated Semi-Supervised Learning

Abstract: Federated Semi-Supervised Learning (FedSSL) has attracted growing attention from both academic and industrial researchers, due to its unique characteristics of co-training machine learning models with isolated yet unlabeled data. Most existing FedSSL methods focus on the classical scenario, i.e., the labeled and unlabeled data are stored at the client side. However, in real-world applications, client users may not provide labels without any incentive. Thus, the scenario of labels at the server side is more practical…

Cited by 4 publications (6 citation statements)
References 23 publications (28 reference statements)
“…Moreover, two federated self-supervised learning frameworks for images with limited labels were proposed in [39], based on federated CLR with feature sharing (FedCLF). In contrast to previous approaches that assume labeled data are available at the client side, Long et al. [26] introduced FedCon, a novel framework designed to address scenarios where local client data is unlabeled and only the server has access to labeled data. FedCon tackles this challenge by employing a contrastive network architecture, which consists of two subnetworks, enabling effective handling of the unlabeled data at the client side.…”
Section: Federated Learning
confidence: 99%
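The two-subnetwork contrastive architecture mentioned in this snippet can be illustrated with a minimal sketch. This is not the authors' released code: the SimSiam-style projector/predictor split, the layer sizes, and the negative-cosine loss are assumptions chosen only to show the general pattern of contrasting two views of unlabeled client data.

```python
# Minimal sketch of a two-subnetwork contrastive objective for unlabeled
# client data (illustrative only; not the FedCon authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoSubnetContrastive(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, proj_dim: int = 128):
        super().__init__()
        # First subnetwork: shared backbone plus a projection head.
        self.backbone = backbone
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, proj_dim), nn.ReLU(), nn.Linear(proj_dim, proj_dim)
        )
        # Second subnetwork: a prediction head on top of the projection.
        self.predictor = nn.Sequential(
            nn.Linear(proj_dim, proj_dim), nn.ReLU(), nn.Linear(proj_dim, proj_dim)
        )

    def forward(self, x1: torch.Tensor, x2: torch.Tensor) -> torch.Tensor:
        # x1, x2 are two augmented views of the same unlabeled client batch.
        z1 = self.projector(self.backbone(x1))
        z2 = self.projector(self.backbone(x2))
        p1, p2 = self.predictor(z1), self.predictor(z2)
        # Symmetric negative cosine similarity; projections are detached and
        # used as targets, as in SimSiam-style objectives.
        loss = -(F.cosine_similarity(p1, z2.detach(), dim=-1).mean()
                 + F.cosine_similarity(p2, z1.detach(), dim=-1).mean()) / 2
        return loss
```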
“…Research groups have devoted resources to approaches specifically designed for federated settings, leveraging the power of representation learning while respecting the decentralized nature of data. These approaches often involve a range of learning techniques, such as self-supervised, semi-supervised, unsupervised, and transfer learning [44,39,26,18,11]. In the context of federated learning…”
Section: Introduction
confidence: 99%
“…This scenario is referred to as FSSL with global semi-supervision [40]. In other applications where large labeled datasets are more readily available or can be transferred, the roles of the aggregating server and the labeled data owner may overlap, and this scenario is instead referred to as the labels-at-server scenario [34]-[38].…”
Section: B. Federated Learning With Unlabeled Data
confidence: 99%
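For readers unfamiliar with the labels-at-server setting referenced above, a rough sketch of one training round is given below. It is an assumption-laden illustration (FedAvg-style aggregation, a generic unsupervised client step, and a supervised server step), not the procedure of any specific cited method.

```python
# Rough sketch of one round in a labels-at-server FSSL setup (illustrative;
# aggregation and update rules vary between the cited methods).
import copy

def fedavg(states, weights):
    # Weighted average of client state_dicts (FedAvg-style aggregation).
    total = sum(weights)
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = sum(w * s[key] for s, w in zip(states, weights)) / total
    return avg

def labels_at_server_round(global_model, clients, server_labeled_loader,
                           local_unsup_step, server_sup_step):
    # 1) Each client updates a copy of the global model on its unlabeled data.
    states, sizes = [], []
    for client in clients:
        local = copy.deepcopy(global_model)
        local_unsup_step(local, client["unlabeled_loader"])  # e.g. a contrastive loss
        states.append(local.state_dict())
        sizes.append(client["num_samples"])
    # 2) The server aggregates the client updates.
    global_model.load_state_dict(fedavg(states, sizes))
    # 3) The server fine-tunes on its own labeled data (global semi-supervision).
    server_sup_step(global_model, server_labeled_loader)
    return global_model
```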
“…[37] adapted a combination of two state-of-the-art semi-supervised methods, FixMatch [17] and MixMatch [19], in a federated setting. [38] and [39] proposed methods based on contrastive learning and knowledge distillation, respectively. Apart from classification, FSSL has also been used for the task of COVID-19 region segmentation in chest computed tomography scans [40].…”
Section: B. Federated Learning With Unlabeled Data
confidence: 99%
“…In detail, the authors used two different data augmentation methods to transform the unlabeled data, and supervised the strongly augmented samples with sharpened low-entropy predictions on the weakly augmented samples. Since the semi-supervised learning problem also exists in the FL scenario, some recent studies [13], [32] and [33] have investigated constructing SSL frameworks for FL-based applications. For instance, Jeong et al. [13] presented an inter-client consistency loss and a disjoint learning pattern on labeled and unlabeled data.…”
Section: Related Work
confidence: 99%
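The weak/strong augmentation scheme described in this snippet (sharpened, low-entropy predictions on weakly augmented samples supervising strongly augmented ones) follows the FixMatch pattern. The sketch below is only an illustration of that pattern; the confidence threshold, temperature, and function name are assumptions, not values from the cited papers.

```python
# Minimal sketch of the weak/strong augmentation consistency described above
# (FixMatch-style pseudo-labeling; hyperparameters are illustrative).
import torch
import torch.nn.functional as F

def unlabeled_consistency_loss(model, x_weak, x_strong,
                               threshold=0.95, temperature=0.5):
    # Sharpened, low-entropy predictions on the weakly augmented samples
    # serve as targets for the strongly augmented samples.
    with torch.no_grad():
        logits_w = model(x_weak)
        probs_w = torch.softmax(logits_w / temperature, dim=-1)  # temperature sharpening
        max_prob, pseudo_label = probs_w.max(dim=-1)
        mask = (max_prob >= threshold).float()  # keep only confident pseudo-labels
    logits_s = model(x_strong)
    loss = F.cross_entropy(logits_s, pseudo_label, reduction="none")
    return (loss * mask).mean()
```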