2019
DOI: 10.48550/arxiv.1907.13418
Preprint

Uncertainty Quantification in Deep Learning for Safer Neuroimage Enhancement

Ryutaro Tanno, Daniel Worrall, Enrico Kaden, et al.

Cited by 7 publications (11 citation statements) · References 100 publications
“…ifications to a standard DNN. As shown in prior work (e.g., [3,5]), the final experiment demonstrates how having an estimate of uncertainty adds insight into a quantitative result (see Fig. 4).…”
Section: Discussion (supporting)
confidence: 65%
“…Furthermore, capturing one type of uncertainty but not the other is insufficient to estimate the predictive uncertainty, an encompassing measure which describes how well any voxel can be predicted. Uncertainty estimation in image translation, segmentation, and super-resolution has been explored [3,4,5]; however, in this work, we verify that the epistemic and aleatoric uncertainty estimates captured in an image translation task align with the definitions of the terms.…”
Section: Introduction (mentioning)
confidence: 58%
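The decomposition this statement describes can be made concrete in a few lines. A minimal sketch, assuming a PyTorch regression network that contains dropout layers and returns a per-voxel mean and log-variance; the function name and model interface are illustrative assumptions, not the cited papers' implementation:

import torch

def predictive_variance(model, x, n_samples=20):
    # Law of total variance: total = E[var] (aleatoric) + var[E] (epistemic).
    # Keeping dropout active at test time turns repeated forward passes
    # into MC-dropout samples from an approximate posterior.
    model.train()  # leave dropout enabled for stochastic passes
    means, variances = [], []
    with torch.no_grad():
        for _ in range(n_samples):
            mu, log_var = model(x)  # assumed output signature (mean, log-variance)
            means.append(mu)
            variances.append(log_var.exp())
    means = torch.stack(means)                    # shape (T, ...)
    variances = torch.stack(variances)
    aleatoric = variances.mean(dim=0)             # expected data noise
    epistemic = means.var(dim=0, unbiased=False)  # spread of the means
    return aleatoric, epistemic, aleatoric + epistemic

The sum of the two terms is the per-voxel predictive variance, the "encompassing measure" the quoted passage refers to.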
“…by means of ensembling of networks [42] or variational dropout [43]. In addition, previous work by Kendall and Gal [44] and Tanno et al. [45] has shown that the quality of uncertainty estimates can be improved if model (epistemic) and data (aleatoric) uncertainty are assessed simultaneously with separate measures. The current study focused on the assessment of model uncertainty by means of MC-dropout and entropy, which is a combination of epistemic and aleatoric uncertainty.…”
Section: Discussion (mentioning)
confidence: 99%
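For contrast with the separate-measures approach above, the MC-dropout-plus-entropy measure the quoted study used can be sketched as follows. This is an illustration assuming a classification or segmentation network whose forward pass returns per-voxel class logits; names and shapes are assumptions:

import torch

def mc_dropout_entropy(model, x, n_samples=20, eps=1e-12):
    # Entropy of the mean MC-dropout softmax. This single map mixes
    # epistemic and aleatoric uncertainty, which is exactly the
    # limitation the statement above points out.
    model.train()  # keep dropout on at inference (MC-dropout)
    probs = []
    with torch.no_grad():
        for _ in range(n_samples):
            probs.append(torch.softmax(model(x), dim=1))
    p_mean = torch.stack(probs).mean(dim=0)             # (B, C, H, W)
    return -(p_mean * (p_mean + eps).log()).sum(dim=1)  # per-voxel entropy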
“…In previous studies, pixel-wise aleatoric uncertainty has been generated by incorporating a Gaussian negative log-likelihood (NLL) loss into the neural network [28], and this has been applied in MR super-resolution reconstruction [26], [27]. Such uncertainty only represents the uncertainty arising from the data, which cannot be prevented, and it is not the main issue when applying deep-learning-based MRI restoration in clinical practice.…”
Section: E. Uncertainty (mentioning)
confidence: 99%
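The Gaussian NLL loss mentioned in this statement has a compact closed form. A minimal sketch, assuming the network predicts a per-pixel mean and log-variance; predicting the log-variance is a common numerical-stability choice and may differ from the cited papers' exact parameterisation:

import torch

def gaussian_nll(mu, log_var, target):
    # Heteroscedastic Gaussian negative log-likelihood (up to a constant).
    # Minimising it lets the variance head absorb irreducible data noise,
    # so exp(log_var) acts as a per-pixel aleatoric uncertainty map.
    return (0.5 * torch.exp(-log_var) * (target - mu) ** 2
            + 0.5 * log_var).mean()

PyTorch also ships a built-in torch.nn.GaussianNLLLoss, which takes the variance itself rather than its logarithm.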
“…Finally, doctors are usually concerned about the accuracy of the restored high-quality images. Tanno et al. [26] and Qin et al. [27] employed a method that generates aleatoric uncertainty [29] as auxiliary information to help doctors judge the reliability of the restored MR image. However, such a method cannot tell whether the uncertainty is caused by noise in the training data or by errors the deep neural network makes on out-of-distribution (OOD) data, i.e., when the distribution of the training data is not identical to the distribution of the test data, as is common in real clinical environments.…”
Section: Introduction (mentioning)
confidence: 99%