2022
DOI: 10.48550/arxiv.2204.04218
Preprint
Multimodal Multi-Head Convolutional Attention with Various Kernel Sizes for Medical Image Super-Resolution

Abstract: Super-resolving medical images can help physicians in providing more accurate diagnostics. In many situations, computed tomography (CT) or magnetic resonance imaging (MRI) techniques output several scans (modes) during a single investigation, which can jointly be used (in a multimodal fashion) to further boost the quality of super-resolution results. To this end, we propose a novel multimodal multi-head convolutional attention module to super-resolve CT and MRI scans. Our attention module uses the convolution …
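The abstract is truncated, so the exact architecture is not shown here. As a rough illustration of the general idea of convolutional attention with multiple heads of varying kernel sizes (a toy sketch only, not the authors' actual module; the function names, the sigmoid gating, and the averaging of head outputs are all assumptions for illustration):

```python
import numpy as np

def conv2d_same(x, kernel):
    # Naive single-channel 2D cross-correlation with 'same' padding.
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.empty(x.shape, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def multi_head_conv_attention(x, kernel_sizes=(3, 5, 7), rng=None):
    # Each "head" filters the input with its own kernel size, so heads
    # see different spatial extents; the head outputs are averaged and
    # squashed into (0, 1) attention gates that re-weight the input.
    # (Random kernels stand in for learned weights in this sketch.)
    rng = np.random.default_rng(0) if rng is None else rng
    head_maps = [conv2d_same(x, rng.normal(scale=1.0 / ks, size=(ks, ks)))
                 for ks in kernel_sizes]
    gate = 1.0 / (1.0 + np.exp(-np.mean(head_maps, axis=0)))  # sigmoid
    return x * gate
```

Because each gate value lies strictly in (0, 1), the module attenuates features rather than amplifying them; a learned version would place trainable convolution weights where the random kernels are.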

Cited by 3 publications (2 citation statements). References 41 publications.
“…The attention mechanism [54] has been regarded as an advanced technique to capture long-range dependences. So, some additional complications of the neural network give the possibility to overcome the limitations of the kernels to determine and consider long-range dependences between pixels [55][56][57].…”
Section: Discussion
confidence: 99%
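The statement above contrasts attention, which captures long-range pixel dependencies, with convolution kernels, whose receptive field is local. A minimal sketch of the mechanism it refers to, scaled dot-product attention (a generic formulation, not code from the cited works):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Every query attends to every key, so the output at one position
    # can depend on values arbitrarily far away -- unlike a fixed-size
    # convolution kernel, whose receptive field is local.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys
    return weights @ V, weights
```

Each row of `weights` is a distribution over all positions, which is exactly why attention sidesteps the locality limitation of convolution kernels.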
“…The PSNR is the ratio between the peak value power and noise power [34]. The PSNR is the most commonly used evaluation index to measure the quality of the lossy transformation reconstruction.…”
Section: Evaluation Metrics
confidence: 99%
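The PSNR described above is conventionally computed as PSNR = 10 · log10(MAX² / MSE), where MAX is the peak signal value (e.g. 255 for 8-bit images) and MSE is the mean squared error between the reference and the reconstruction. A minimal implementation (function name and `max_val` default are illustrative):

```python
import numpy as np

def psnr(reference, reconstruction, max_val=255.0):
    # PSNR = 10 * log10(MAX^2 / MSE): ratio of peak signal power to
    # the power of the reconstruction error, in decibels.
    ref = np.asarray(reference, dtype=float)
    rec = np.asarray(reconstruction, dtype=float)
    mse = np.mean((ref - rec) ** 2)
    if mse == 0:
        return float("inf")  # identical images: no noise
    return 10.0 * np.log10(max_val ** 2 / mse)
```

Higher values indicate a reconstruction closer to the reference; identical images give infinite PSNR, which is why it is a lossy-reconstruction metric.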