2021
DOI: 10.3390/jimaging7110243

Conditional Invertible Neural Networks for Medical Imaging

Abstract: Over recent years, deep learning methods have become an increasingly popular choice for solving tasks from the field of inverse problems. Many of these new data-driven methods have produced impressive results, although most only give point estimates for the reconstruction. However, especially in the analysis of ill-posed inverse problems, the study of uncertainties is essential. In our work, we apply generative flow-based models based on invertible neural networks to two challenging medical imaging tasks, i.e.…


Cited by 34 publications (20 citation statements)
References: 36 publications

“…Assuming that we are given $N$ example patches $p_1, \dots, p_N$ and corresponding additional information $c_1, \dots, c_N$, we now aim to approximate the underlying conditional distribution of the patches. To this end, we train a conditional patchNR [5,14,20]. For two random variables $X_p$ and $Y_c$, a conditional NF aims to approximate the conditional distributions $P_{X_p \mid Y_c = c}$ for all possible observations of $c$. Formally, a conditional NF is a learned mapping…”
Section: Conditional Patch Normalizing Flows (mentioning)
confidence: 99%
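
The passage above describes conditional maximum-likelihood training: an invertible map $T(\cdot\,; c)$ sends a patch $x$ (given condition $c$) to a standard normal latent, and is trained by minimizing $-\log p_{X_p \mid Y_c = c}(x) = \tfrac{1}{2}\|T(x; c)\|^2 - \log\lvert\det J_T(x; c)\rvert + \text{const}$. Below is a minimal PyTorch sketch of one conditional affine coupling block with this loss; the class name, subnet, and all dimensions are illustrative assumptions, not code from the cited papers.

```python
# Minimal sketch of conditional maximum-likelihood training for a
# conditional NF, assuming PyTorch. ConditionalCoupling and all sizes
# are illustrative assumptions, not code from the cited papers.
import torch
import torch.nn as nn

class ConditionalCoupling(nn.Module):
    """One affine coupling block whose scale/shift subnet also sees c."""
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(   # receives [unchanged half of x, c]
            nn.Linear(self.half + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x, c):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(torch.cat([x1, c], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                  # bounded log-scales for stability
        z2 = x2 * torch.exp(s) + t
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output, log|det J|

def nll(z, log_det):
    """-log p(x | c) under a standard normal latent, up to a constant."""
    return (0.5 * (z ** 2).sum(dim=1) - log_det).mean()

# toy usage: 16-dim "patches" x with an 8-dim condition c
flow = ConditionalCoupling(dim=16, cond_dim=8)
x, c = torch.randn(32, 16), torch.randn(32, 8)
z, log_det = flow(x, c)
nll(z, log_det).backward()   # loss for one maximum-likelihood step
```
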
“…It has the additional advantage that for very high-dimensional or complex datasets $y$, we can include a feature extraction network in the conditioning block and fully integrate it into the training process [2,3]. This makes it possible to employ cINNs in image processing tasks [13,32]. The introduction of machine learning for solving regression, classification and clustering problems has revolutionized scientific research, and in particular has provided effective methods for analyzing big astronomical data [17,24].…”
Section: Invertible Neural Network (mentioning)
confidence: 99%
“…It has the additional advantage that for very high-dimensional or complex datasets $y$, we can include a feature extraction network in the conditioning block and fully integrate it into the training process [2,3]. This makes it possible to employ cINNs in image processing tasks [13,32]. The particular example depicted here consists of eight affine coupling blocks interchanged with permutation layers.…”
Section: Invertible Neural Network (mentioning)
confidence: 99%
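
As a companion to the statements above, here is a hedged sketch of how a feature-extraction network can sit inside the conditioning path of a cINN built from coupling blocks interleaved with permutation layers. It reuses the ConditionalCoupling block from the previous sketch; FeatureNet, the eight-block depth, and all sizes are illustrative assumptions, not the cited papers' architecture.

```python
# Sketch of a cINN whose condition features are learned jointly with the
# flow, assuming PyTorch and the ConditionalCoupling block defined in the
# previous sketch. FeatureNet and all sizes are illustrative assumptions.
import torch
import torch.nn as nn

class FeatureNet(nn.Module):
    """Small CNN mapping a conditioning image y to a feature vector h."""
    def __init__(self, cond_dim=8):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, cond_dim)

    def forward(self, y):
        return self.fc(self.conv(y).flatten(1))

class ConditionalINN(nn.Module):
    """Eight coupling blocks interleaved with fixed random permutations."""
    def __init__(self, dim=16, cond_dim=8, n_blocks=8):
        super().__init__()
        self.blocks = nn.ModuleList(
            [ConditionalCoupling(dim, cond_dim) for _ in range(n_blocks)])
        for i in range(n_blocks):            # fixed permutation layers
            self.register_buffer(f"perm{i}", torch.randperm(dim))
        self.features = FeatureNet(cond_dim) # trained end to end with the flow

    def forward(self, x, y):
        h = self.features(y)                 # feature extraction in the graph
        log_det = x.new_zeros(x.shape[0])
        for i, block in enumerate(self.blocks):
            x = x[:, getattr(self, f"perm{i}")]
            x, ld = block(x, h)
            log_det = log_det + ld
        return x, log_det

# toy usage: 16-dim samples conditioned on 1-channel 32x32 images
model = ConditionalINN()
z, log_det = model(torch.randn(4, 16), torch.randn(4, 1, 32, 32))
```
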
“…We train our INN architecture such that each pixel-wise conditioning vector (PCC) is concatenated with high-level features extracted from the entire image using an additional feed-forward network $H$. For this task, we use a pre-trained VGG network, similar to [Ardizzone et al. 2020; Denker et al. 2021], as the feed-forward network. The final conditioning vector for a given pixel is given as $c_p = [C(y_p), H(y)]$, where the weights of $H(\cdot)$ are simultaneously updated alongside the weights of the INN.…”
Section: Conditioning Vector (mentioning)
confidence: 99%
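
The construction $c_p = [C(y_p), H(y)]$ can be sketched directly: a pretrained (but still trainable) VGG trunk produces one global feature vector per image, which is broadcast to every pixel and concatenated with the per-pixel code. This assumes PyTorch/torchvision; the per-pixel encoder $C$ is passed in as a precomputed tensor, and the pooling, output dimension, and VGG variant are assumptions rather than details from the quoted paper.

```python
# Illustrative sketch of the per-pixel conditioning vector
# c_p = [C(y_p), H(y)], assuming PyTorch/torchvision. The per-pixel
# code C is passed in as a precomputed tensor; the pooling, output
# dimension, and VGG variant are assumptions, not details from the paper.
import torch
import torch.nn as nn
from torchvision.models import vgg16

class GlobalFeatures(nn.Module):
    """Pretrained VGG trunk producing one global feature vector H(y) per
    image; its weights stay trainable and update alongside the INN."""
    def __init__(self, out_dim=64):
        super().__init__()
        # downloads ImageNet weights on first use
        self.trunk = vgg16(weights="IMAGENET1K_V1").features
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(512, out_dim)

    def forward(self, y):                        # y: (B, 3, H, W)
        return self.fc(self.pool(self.trunk(y)).flatten(1))

def conditioning_vectors(pixel_code, y, global_net):
    """Concatenate the per-pixel code C(y_p) with broadcast H(y)."""
    h = global_net(y)                            # (B, D_global)
    _, _, H, W = pixel_code.shape
    h = h[:, :, None, None].expand(-1, -1, H, W) # same H(y) at every pixel
    return torch.cat([pixel_code, h], dim=1)     # (B, D_pixel + D_global, H, W)

# toy usage with random tensors standing in for C(y_p) and y
c = conditioning_vectors(torch.randn(2, 4, 64, 64),
                         torch.randn(2, 3, 64, 64),
                         GlobalFeatures())
print(c.shape)  # torch.Size([2, 68, 64, 64])
```
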