2022
DOI: 10.1007/s00371-022-02599-8
Unpaired low-dose CT denoising via an improved cycle-consistent adversarial network with attention ensemble

Cited by 8 publications (2 citation statements)
References 55 publications
“…Thus, we can estimate the target values (i.e., denoised content) from repeated noisy measurements (i.e., noisy multi-frame images). The attention gate has been applied to image restoration tasks such as denoising [68]–[70], super-resolution [71], inpainting [72], dehazing [73], and image enhancement [74]. It improved reconstruction performance by exploiting useful features and ignoring irrelevant ones, and it effectively suppressed noise when the attention gate was included [69].…”
Section: Discussion (mentioning)
confidence: 99%
“…The second step improves image quality by training the U-Net-based final denoiser using the MSAU-Net pre-trained in the first training step. Yin et al. [41] proposed an improved cycle-consistent adversarial network (CycleGAN) for LDCT image denoising that includes a perceptual loss and a U-Net-based generator in which the skip connections between the encoder and decoder paths are redesigned.…”
Section: Background and Related Work (mentioning)
confidence: 99%
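The excerpt above summarizes a CycleGAN variant for unpaired LDCT denoising that augments the usual objectives with a perceptual term. The sketch below illustrates one plausible generator loss of this kind; it is not the cited authors' implementation. The choice of a VGG-16 feature extractor as the perceptual backbone, the LSGAN adversarial form, applying the perceptual term to the cycle reconstruction, and the weighting factors are all assumptions made for the example.

```python
# Illustrative sketch (not the authors' code) of a CycleGAN-style generator
# loss for unpaired LDCT denoising with an added perceptual term, assuming
# generators G (LDCT -> NDCT) and F (NDCT -> LDCT) and a frozen VGG-16
# feature extractor as the perceptual backbone.
import torch
import torch.nn as nn
from torchvision.models import vgg16, VGG16_Weights

vgg_features = vgg16(weights=VGG16_Weights.DEFAULT).features[:16].eval()
for p in vgg_features.parameters():
    p.requires_grad_(False)

l1 = nn.L1Loss()

def perceptual_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Compare images in VGG feature space; replicate single-channel CT
    # slices to three channels to match the VGG input.
    to_rgb = lambda x: x.repeat(1, 3, 1, 1) if x.shape[1] == 1 else x
    return l1(vgg_features(to_rgb(pred)), vgg_features(to_rgb(target)))

def generator_loss(G, F_gen, D_ndct, ldct, ndct, lambda_cyc=10.0, lambda_perc=1.0):
    fake_ndct = G(ldct)          # denoised estimate in the NDCT domain
    rec_ldct = F_gen(fake_ndct)  # cycle back to the LDCT domain
    adv = ((D_ndct(fake_ndct) - 1) ** 2).mean()    # LSGAN adversarial term
    cyc = l1(rec_ldct, ldct)                       # pixel-wise cycle consistency
    perc = perceptual_loss(rec_ldct, ldct)         # feature-level cycle consistency
    return adv + lambda_cyc * cyc + lambda_perc * perc
```

In an unpaired setting there is no ground-truth NDCT image matched to each LDCT input, so the perceptual term here is applied between the input and its cycle reconstruction; the full model would also include the symmetric loss for the NDCT-to-LDCT direction and the two discriminator losses.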