2020
DOI: 10.1007/978-3-030-59710-8_53
Double-Uncertainty Weighted Method for Semi-supervised Learning

Cited by 81 publications (47 citation statements); references 7 publications.
“…Table 4 summarizes the quantitative results of our proposed method and several state-of-the-art methods, including LG-ER-MT [27], DUWM [28], MC-Net [29], V-Net [30], Bayesian V-Net, and AJSQnet [31]. Among them, LG-ER-MT, DUWM, and MC-Net use a semi-supervised strategy with uncertainty prediction, while V-Net, Bayesian V-Net, and AJSQnet are trained on fully labeled data; Bayesian V-Net adapts the vanilla V-Net with a Bayesian network.…”
Section: Results
confidence: 99%
“…Uncertainties for image segmentation are derived from general considerations of the statistical model, from resampling training data sets in ensemble approaches [36], or from modifications of the predictive procedure such as Monte Carlo dropout [37]. For semi-supervised learning, uncertainty can be used to judge whether the model provides accurate and confident predictions, which can be leveraged to further exploit the unlabeled data and has been applied to many semi-supervised medical image segmentation tasks [14], [15]. Wang et al. [38] found that Monte Carlo dropout performs better for uncertainty estimation.…”
Section: Uncertainty Estimation
confidence: 99%
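The citation above describes Monte Carlo dropout: keeping dropout active at inference, running several stochastic forward passes, and reading the spread of the averaged predictions as epistemic uncertainty. Below is a minimal NumPy sketch of that idea; the toy `noisy_logits` model and the use of predictive entropy as the uncertainty score are illustrative assumptions, not the specific network or metric of any cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_uncertainty(logits_fn, x, n_passes=8):
    """Estimate epistemic uncertainty via Monte Carlo dropout:
    run n stochastic forward passes through a model whose dropout
    stays on, then compute the predictive entropy of the averaged
    softmax output (higher entropy = less confident prediction)."""
    probs = []
    for _ in range(n_passes):
        z = logits_fn(x)  # each call applies a fresh dropout mask
        e = np.exp(z - z.max(axis=-1, keepdims=True))
        probs.append(e / e.sum(axis=-1, keepdims=True))
    p_mean = np.mean(probs, axis=0)  # average class probabilities
    entropy = -(p_mean * np.log(p_mean + 1e-12)).sum(axis=-1)
    return p_mean, entropy

# Toy stand-in for a segmentation network: a linear layer whose
# inputs are randomly dropped, mimicking test-time dropout.
W = rng.normal(size=(4, 3))

def noisy_logits(x, p_drop=0.5):
    mask = rng.random(x.shape) > p_drop
    return (x * mask / (1 - p_drop)) @ W

x = rng.normal(size=(5, 4))  # 5 voxels, 4 features each
p_mean, unc = mc_dropout_uncertainty(noisy_logits, x)
```

Voxels whose entropy is high across the stochastic passes are the ones a semi-supervised method would treat as unreliable supervision.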
“…To demonstrate the effectiveness of our method, a comprehensive comparison with existing methods is conducted on the LA dataset. We evaluate our method against several recent state-of-the-art semi-supervised segmentation methods, including transformation-consistent self-ensembling (TCSE) [34], mean teacher (MT) [13], uncertainty-aware mean teacher (UA-MT) [14], entropy minimization (Entropy Mini) [46], shape-aware semi-supervised segmentation (SASS) [17], the double-uncertainty weighted method (DUWM) [15], dual-task consistency learning (DTC) [18], and uncertainty rectified pyramid consistency (URPC) [35]. To ensure a fair comparison, we used the same network backbone in these methods.…”
Section: Performance of Using Different Percentages of Labeled Data
confidence: 99%
“…The first stage aims to generate high-quality pseudo labels, and the second stage uses the pseudo labels to retrain the network and regularize features for both labeled and unlabeled images. Existing uncertainty-based semi-supervised methods [5,13,14] have achieved strong results by considering the reliability of the supervision for the unlabeled images. These methods exploit epistemic uncertainty, a kind of uncertainty about the model's parameters arising from a lack of data, either in the output space [5,13,14] or in the feature space [14], as guidance for identifying trustworthy supervision.…”
Section: Introduction
confidence: 99%
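The statement above describes the common pattern in uncertainty-based semi-supervised methods such as UA-MT [14]: only predictions the teacher model is confident about are allowed to supervise the student. A minimal sketch of that filtering step is below; the squared-error consistency term and the fixed threshold are simplifying assumptions (papers typically ramp the threshold over training), not the exact loss of any single cited method.

```python
import numpy as np

def uncertainty_masked_consistency(student_p, teacher_p, uncertainty, thresh):
    """Consistency loss between student and teacher class probabilities,
    computed only over voxels whose teacher uncertainty is below `thresh`,
    so that only confident teacher predictions supervise the student.

    student_p, teacher_p: (n_voxels, n_classes) probability arrays
    uncertainty:          (n_voxels,) per-voxel uncertainty scores
    """
    mask = (uncertainty < thresh).astype(float)      # 1 = trustworthy voxel
    sq_err = ((student_p - teacher_p) ** 2).sum(axis=-1)
    # Average the squared error over the trusted voxels only.
    return (mask * sq_err).sum() / max(mask.sum(), 1.0)

# Two voxels: the first is confident and agrees; the second is
# uncertain and disagrees, so it is excluded from the loss.
student_p = np.array([[1.0, 0.0], [0.5, 0.5]])
teacher_p = np.array([[1.0, 0.0], [0.0, 1.0]])
unc = np.array([0.1, 2.0])
loss = uncertainty_masked_consistency(student_p, teacher_p, unc, thresh=1.0)
```

With the threshold at 1.0 the disagreeing-but-uncertain voxel is masked out and the loss is zero; raising the threshold re-admits it and the disagreement starts contributing.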