2024
DOI: 10.3390/s24113560

A Residual Dense Attention Generative Adversarial Network for Microscopic Image Super-Resolution

Sanya Liu, Xiao Weng, Xingen Gao, et al.

Abstract: With the development of deep learning, the Super-Resolution (SR) reconstruction of microscopic images has improved significantly. However, the scarcity of microscopic images for training, the underutilization of hierarchical features in original Low-Resolution (LR) images, and the high-frequency noise unrelated to the image structure generated during the reconstruction process are still challenges in the Single Image Super-Resolution (SISR) field. Faced with these issues, we first collected sufficient micros…
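
To make the block named in the title concrete, below is a minimal PyTorch sketch of a residual dense block with channel attention, in the spirit of the RDN/RCAN line of work the abstract alludes to. This is an illustration only: the layer counts, growth rate, and the squeeze-and-excitation-style attention used here are assumptions, not the authors' actual RDAGAN architecture.

```python
import torch
import torch.nn as nn

class ResidualDenseAttentionBlock(nn.Module):
    """Hypothetical sketch: dense convolutions + channel attention + residual skip.

    Not the paper's exact block; layer counts and names are assumptions.
    """
    def __init__(self, channels=64, growth=32, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Dense connectivity: each layer sees all previous feature maps
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels + i * growth, growth, 3, padding=1),
                nn.ReLU(inplace=True),
            ))
        # 1x1 conv fuses the densely connected features back to `channels`
        self.fuse = nn.Conv2d(channels + num_layers * growth, channels, 1)
        # Squeeze-and-excitation-style channel attention (assumed variant)
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 8, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 8, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        fused = self.fuse(torch.cat(feats, dim=1))
        fused = fused * self.attn(fused)  # reweight channels
        return x + fused                  # local residual connection

if __name__ == "__main__":
    block = ResidualDenseAttentionBlock()
    out = block(torch.randn(1, 64, 32, 32))
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```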

Cited by 1 publication (1 citation statement)
References: 27 publications

“…However, one of the major challenges of deep learning-based algorithms is high-frequency detail preservation. Numerous studies have proposed diverse algorithms to address this challenge, including residual learning [4,5], recursive structures [6][7][8], dense connections [9][10][11], and multi-path learning [12,13]. In recent times, attention-based algorithms have gained prominence, notably after the popularity of Transformer-based algorithms.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%