Semi-supervised Medical Image Classification with Global Latent Mixing
2020 · DOI: 10.1007/978-3-030-59710-8_59

Search citation statements

Order By: Relevance

Paper Sections

Select...
3
1
1

Citation Types

0
27
0

Year Published

2021
2021
2021
2021

Publication Types

Select...
5
3

Relationship

2
6

Authors

Journals

Cited by 36 publications (27 citation statements) · References 7 publications
“…As such, in our case, any unlabeled data point x_u is a random sample from the combination of the two datasets X_l and X_u. This way of guessing the label encourages the model to be consistent across different augmentations [Gyawali et al. 2020]. Although, following standard practice, we do not propagate gradients through the guessed labels, it should be noted that the guessed labels may change as the segmentation network f is updated over training.…”
Section: Guessing Labels (citation type: mentioning)
Confidence: 99%
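The excerpt above describes pseudo-label guessing: predictions are averaged over several augmentations of the same unlabeled point, sharpened, and treated as a fixed target (no gradients flow through the guess). A minimal sketch follows, with hypothetical names throughout — `model`, `augment`, `guess_label`, and the toy weights are stand-ins for the cited papers' actual networks and augmentation pipelines, not their implementation.

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax over a list of scores.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def model(x):
    # Stand-in "network": fixed linear scores over 3 classes (hypothetical).
    w = [[1.0, -0.5], [0.2, 0.8], [-1.0, 0.3]]
    return softmax([sum(wi * xi for wi, xi in zip(row, x)) for row in w])

def augment(x, rng):
    # Weak augmentation: small additive Gaussian noise.
    return [xi + rng.gauss(0.0, 0.05) for xi in x]

def guess_label(x_u, k=4, temperature=0.5, seed=0):
    """Average predictions over k augmentations of x_u, then sharpen.
    No gradients are propagated through this guess (in a framework such
    as PyTorch, this block would sit under no_grad / use detach);
    the guess still drifts as the network's weights are updated."""
    rng = random.Random(seed)
    probs = [model(augment(x_u, rng)) for _ in range(k)]
    avg = [sum(p[c] for p in probs) / k for c in range(len(probs[0]))]
    powed = [p ** (1.0 / temperature) for p in avg]  # temperature sharpening
    z = sum(powed)
    return [p / z for p in powed]

q = guess_label([0.5, -0.2])
print(q)  # a sharpened pseudo-label distribution over 3 classes
```

Averaging across augmentations makes the guessed label less sensitive to any single perturbation, which is what encourages the consistency the excerpt mentions.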
(The same statement also appears under Section: SSL_D: SSL with Domain-specific Augmentation, citation type: mentioning, confidence: 99%.)
“…To address this problem, a naive solution is simply to integrate off-the-shelf semi-supervised learning (SSL) methods into the federated learning paradigm. However, previous SSL methods are typically designed for a centralized training setting [2,10,17,29] and rely heavily on the assumption that labeled data is accessible to assist the learning from unlabeled data [1,4]. In consistency-based methods [5,31], for instance, the regularization toward perturbation-invariant model predictions requires synchronous supervision from labeled data, in order to obtain the task knowledge needed to produce reliable predictions for the unlabeled data on which the consistency regularization is imposed.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%