2021
DOI: 10.48550/arxiv.2103.06030
Preprint

FedDG: Federated Domain Generalization on Medical Image Segmentation via Episodic Learning in Continuous Frequency Space

Abstract: Federated learning allows distributed medical institutions to collaboratively learn a shared prediction model with privacy protection. However, at clinical deployment, the models trained in federated learning can still suffer a performance drop when applied to completely unseen hospitals outside the federation. In this paper, we point out and solve a novel problem setting of federated domain generalization (FedDG), which aims to learn a federated model from multiple distributed source domains such that it can …

Cited by 4 publications (7 citation statements)
References 57 publications
“…The others design heuristic schemes, such as manipulating deep neural network architectures [37], leveraging self-supervision signals [13], designing novel training paradigms [38], [39], or conducting data augmentation [14], [40]. Very recently, with the rise of model-agnostic meta-learning (MAML) [19], a paradigm of episodic learning has been introduced to address DG [16], [17], [41]-[43]. The principle behind this is to episodically simulate virtual meta-tasks so that the knowledge of maintaining generalization under domain shift can be transferred to the real task of DG.…”
Section: Domain Generalization
confidence: 99%
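The episodic scheme quoted above can be illustrated with a minimal sketch: each training episode partitions the available source domains into a virtual meta-train / meta-test pair, so the model repeatedly experiences simulated domain shift. The function name and domain labels below are illustrative assumptions, not from the cited works.

```python
import random

def make_episode(domains, n_meta_test=1, seed=None):
    """Split source domains into a virtual meta-train / meta-test pair
    for one episode, simulating domain shift during training."""
    rng = random.Random(seed)
    shuffled = list(domains)
    rng.shuffle(shuffled)
    return shuffled[n_meta_test:], shuffled[:n_meta_test]

# Example: three source sites; each episode holds one out as virtual-unseen.
meta_train, meta_test = make_episode(["siteA", "siteB", "siteC"], seed=0)
```

In MAML-style training, the inner update would be taken on `meta_train` and the generalization objective evaluated on `meta_test`; resampling the split each episode is what transfers the "generalize under shift" behavior to the real DG task.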
“…Simple data augmentation such as random flipping is applied. Following the common practice in DG [13], [16], [17], we use leave-one-domain-out cross-validation, i.e., training on multiple source domains and testing on the remaining unseen one, randomly dividing the samples of each domain into 70% training data and 30% testing data. To evaluate the performance on epithelium-stroma classification, we adopt the accuracy metric ACC = N_c / N_t, where N_c and N_t represent the number of correctly classified images and the total number of images, respectively.…”
Section: A. Dataset and Evaluation Metrics
confidence: 99%
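The evaluation protocol quoted above reduces to two small pieces: a leave-one-domain-out split and the accuracy ratio ACC = N_c / N_t. A minimal sketch, with hypothetical domain names and function names of my choosing:

```python
def leave_one_domain_out(domains, held_out):
    """Leave-one-domain-out split: train on all source domains except
    the held-out one, which serves as the unseen test domain."""
    train = [d for d in domains if d != held_out]
    return train, [held_out]

def accuracy(n_correct, n_total):
    """ACC = N_c / N_t: fraction of correctly classified images."""
    return n_correct / n_total

# Example: hold out one of three domains, score 7 of 10 correct.
train_domains, test_domains = leave_one_domain_out(["d1", "d2", "d3"], held_out="d3")
acc = accuracy(7, 10)  # → 0.7
```

In the full protocol this split is repeated once per domain, and each domain's samples are further divided 70%/30% into training and testing data.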
“…Federated learning (FL) [5,6], which targets training machine learning models on data distributed across multiple local systems to resolve the information-silo problem, has shown its advantage in enhancing the performance and generalizability of collaboratively trained models without the need to share any actual data. For example, FL has been applied in computer vision (CV) and natural language processing (NLP) to allow the joint training of powerful and generalizable deep convolutional neural networks and language models on separately stored datasets of images and texts [7,8,9,10,11].…”
Section: Introduction
confidence: 99%
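The collaborative training described in the quote is commonly realized with federated averaging (FedAvg): each client trains locally, and the server aggregates only model weights, weighted by client dataset size, so raw data never leaves a site. The following is a minimal sketch of that aggregation step; the function name and flattened weight vectors are illustrative assumptions.

```python
def fedavg(client_weights, client_sizes):
    """Server-side federated averaging: combine locally trained model
    weights, weighted by each client's dataset size. Only weights are
    exchanged, never raw data."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]

# Example: two hospitals, flattened weight vectors, dataset sizes 1 and 3.
global_w = fedavg([[1.0, 2.0], [3.0, 4.0]], client_sizes=[1, 3])
# → [2.5, 3.5]
```

In a full FL round the server would broadcast `global_w` back to the clients, which resume local training from it.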