2023 IEEE 10th International Conference on Data Science and Advanced Analytics (DSAA)
DOI: 10.1109/dsaa60987.2023.10302579
Solving Inverse Problems in Compressive Imaging with Score-Based Generative Models

Zhen Yuen Chong,
Yaping Zhao,
Zhongrui Wang
et al.
Cited by 8 publications (8 citation statements)
References 20 publications
“…Recently, generative models have drawn considerable attention in the field of MRI. [20][21][22][23][24][25] Leveraging distribution priors, generative models show promise in increasing acceleration while retaining intricate details. The data distribution can be learned implicitly, by directly representing the sampling process as in generative adversarial networks (GANs), 26 or explicitly, by representing the probability density/mass as in Bayesian networks 27 and score-based generative models, 28 which estimate the gradients of the data distribution.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
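The quoted passage distinguishes explicit, score-based models, which learn the gradient of the log-density, from implicit ones such as GANs. A minimal toy sketch of the explicit idea (not code from the paper or the citing works): given a score function, Langevin dynamics draws samples from the corresponding distribution. Here an analytic Gaussian score stands in for a learned network.

```python
import numpy as np

# For a 1-D Gaussian N(mu, sigma^2), the score is
#   d/dx log p(x) = -(x - mu) / sigma^2.
# Running Langevin dynamics with this score should converge to the Gaussian.

def gaussian_score(x, mu=2.0, sigma=1.0):
    """Analytic score of N(mu, sigma^2); stands in for a learned network s_theta."""
    return -(x - mu) / sigma**2

def langevin_sample(score, n_samples=5000, n_steps=500, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples) * 5.0  # broad initialization
    for _ in range(n_steps):
        # Langevin update: drift along the score plus injected Gaussian noise.
        x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(n_samples)
    return x

samples = langevin_sample(gaussian_score)
print(round(samples.mean(), 1))  # close to mu = 2.0
```

In a score-based generative model, `gaussian_score` is replaced by a neural network trained on data, and the step size is annealed across noise levels.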
“…Diffusion-based generative models are increasingly popular for the synthesis of medical images. Importantly, diffusion models can be extended relatively easily to enforce data consistency during the sampling process [25], [26], [27].…”
Citation type: mentioning
Confidence: 99%
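The data-consistency step mentioned above can be sketched in a few lines. This is an illustrative toy (names and the shrinkage "denoiser" are placeholders, not the cited papers' method): for masked measurements y = M·x, the current iterate is projected back onto the measurements after each generative update.

```python
import numpy as np

def data_consistency(x, y, mask):
    """Overwrite the measured entries of iterate x with the measurements y."""
    return np.where(mask, y, x)

# Toy usage: half the entries of an 8-pixel "image" are measured.
rng = np.random.default_rng(0)
x_true = rng.standard_normal(8)
mask = np.array([1, 0, 1, 0, 1, 0, 1, 0], dtype=bool)
y = np.where(mask, x_true, 0.0)

x = np.zeros(8)
for _ in range(10):
    x = 0.5 * x                      # stand-in for a score/denoising update
    x = data_consistency(x, y, mask) # enforce agreement with measurements

print(bool(np.allclose(x[mask], x_true[mask])))  # True: measured entries match
```

In actual compressive-imaging pipelines the projection involves the forward operator (e.g. a subsampled Fourier transform in MRI) rather than a pixel mask, but the alternation between a generative step and a consistency step is the same.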
“…Diffusion image generation models (Fig. 1.A) approximate the score term with a neural network, ∇_{μ_t} log p_t(μ_t) ≈ s_θ(μ_t, t) [25], [26]. The parameters θ of the score-matching network s_θ are obtained by unsupervised training on a dataset assumed to be representative of the underlying noise-free image distribution p(μ_0).…”
Citation type: mentioning
Confidence: 99%
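The training objective alluded to above is typically denoising score matching: perturb clean samples with noise and regress the network output toward the noise direction. A hedged numpy sketch of the generic objective (not the cited work's training code); the minimizer is the score of the noise-perturbed marginal, which this toy verifies by comparing the analytic optimum against a deliberately wrong score.

```python
import numpy as np

# Denoising score matching (DSM):
#   L(theta) = E || s_theta(x0 + sigma*z) + z / sigma ||^2,  z ~ N(0, I).

def dsm_loss(score_fn, x0, sigma, rng):
    """Monte Carlo estimate of the DSM objective for a given score function."""
    z = rng.standard_normal(x0.shape)
    x_noisy = x0 + sigma * z
    return np.mean((score_fn(x_noisy) + z / sigma) ** 2)

rng = np.random.default_rng(0)
x0 = rng.standard_normal(100_000) * 2.0   # clean data ~ N(0, 4)
sigma = 0.5

# Analytic score of the perturbed marginal N(0, 4 + sigma^2) vs. a bad guess.
good = lambda x: -x / (4.0 + sigma**2)
bad = lambda x: -x / 0.1

print(dsm_loss(good, x0, sigma, rng) < dsm_loss(bad, x0, sigma, rng))  # True
```

In practice `good` is a deep network s_θ(μ_t, t) trained over many noise scales σ_t, which yields the score approximation ∇_{μ_t} log p_t(μ_t) ≈ s_θ(μ_t, t) used at sampling time.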