2023
DOI: 10.1049/ipr2.12746
Fused pyramid attention network for single image super‐resolution

Abstract: In image super‐resolution, deep neural networks with various attention mechanisms have achieved noticeable performance in recent years, for example, channel attention and layer attention. Although many researchers have achieved good super‐resolution results with only a certain style of attention, the divergence and the complementarity focused by multiple attention mechanisms are ignored. In addition, most of these methods fail to utilize the diverse information from multi‐scale features. To efficiently manipul…
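The abstract names channel attention as one of the mechanisms being fused. As a rough, generic illustration of what a channel-attention block looks like (a squeeze-and-excitation-style sketch, not the paper's actual module; the reduction ratio is an arbitrary choice), a PyTorch version might be:

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Generic squeeze-and-excitation-style channel attention.

    Illustrative sketch of the general mechanism the abstract refers to,
    not the module proposed in the paper. The reduction ratio (16) is a
    common but arbitrary choice.
    """
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                     # excitation: per-channel weights
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.fc(self.pool(x))                    # (B, C, 1, 1) channel weights
        return x * w                                 # rescale feature maps channel-wise
```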

Cited by 5 publications (5 citation statements)
References 64 publications
“…Chen et al. [38] proposed an attention-in-attention network including both non-attention and attention branches, with a dynamic attention module used to adaptively generate weights for both branches based on the input features. Xia et al.…”
Section: Related Work
confidence: 99%
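The statement above describes an attention branch and a non-attention branch whose contributions are mixed by weights generated dynamically from the input features. A minimal sketch of that idea (an illustration of the concept only, not Chen et al.'s exact module; layer sizes are assumptions):

```python
import torch
import torch.nn as nn

class AttentionInAttentionBlock(nn.Module):
    """Sketch of the attention-in-attention idea: a non-attention (plain
    convolution) branch and an attention branch, mixed by weights predicted
    from the input. Illustrative only; not the cited implementation."""
    def __init__(self, channels: int):
        super().__init__()
        self.non_attn = nn.Conv2d(channels, channels, 3, padding=1)   # plain branch
        self.attn = nn.Sequential(                                    # attention branch
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Sigmoid(),
        )
        # dynamic attention module: predicts two mixing weights from the input
        self.dynamic = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, 2, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.softmax(self.dynamic(x), dim=1)      # (B, 2, 1, 1) branch weights
        plain = self.non_attn(x)
        attended = self.attn(x) * x
        return w[:, 0:1] * plain + w[:, 1:2] * attended
```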
“…We design a recursive self-attention module-based network (RSANet), which is a single branch network that is based on the concept of super-resolution [50]. We combine the PAN and the LRMS after up-sampling by a factor of 4 as the input to the network.…”
Section: RSANet
confidence: 99%
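The quoted construction, combining the PAN image with the LRMS image after up-sampling by a factor of 4 as the network input, can be sketched as follows (tensor layouts and the interpolation mode are assumptions, not details taken from the cited paper):

```python
import torch
import torch.nn.functional as F

def build_network_input(pan: torch.Tensor, lrms: torch.Tensor) -> torch.Tensor:
    """Illustrative sketch: upsample the low-resolution multispectral image
    (LRMS) by a factor of 4 and concatenate it with the panchromatic image
    (PAN) along the channel dimension.

    pan:  (B, 1, 4H, 4W) panchromatic image
    lrms: (B, C, H, W)   low-resolution multispectral image
    """
    up = F.interpolate(lrms, scale_factor=4, mode="bicubic", align_corners=False)
    return torch.cat([pan, up], dim=1)   # (B, C + 1, 4H, 4W)
```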
“…Accordingly, many lightweight methods that incorporate attention mechanisms are constantly evolving. Hui et al. [41] developed a contrast-aware CA mechanism at the end of the progressive refinement module to steadily boost the accuracy of image SR. Chen et al. [42] proposed an attention-in-attention network, in which a non-attention branch and a coupling attention branch adjust the contribution of attention layers dynamically. Wan et al. [43] built a hierarchical SA mechanism to jointly and adaptively fuse local and global hierarchical features for HR reconstruction.…”
Section: Lightweight Super-Resolution Network
confidence: 99%
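A contrast-aware channel attention block is commonly described as replacing the global-average-pooled channel descriptor with a contrast descriptor (standard deviation plus mean). The sketch below illustrates that reading of the mechanism attributed to Hui et al.; the descriptor choice and layer sizes are assumptions rather than details given here:

```python
import torch
import torch.nn as nn

def contrast_pool(x: torch.Tensor) -> torch.Tensor:
    """Per-channel contrast descriptor: standard deviation plus mean over
    the spatial dimensions, giving a (B, C, 1, 1) tensor."""
    mean = x.mean(dim=(2, 3), keepdim=True)
    std = x.var(dim=(2, 3), keepdim=True).clamp_min(1e-8).sqrt()
    return std + mean

class ContrastAwareChannelAttention(nn.Module):
    """Sketch of a contrast-aware channel attention block: standard channel
    attention, but with the average-pooled descriptor swapped for the
    contrast (std + mean) descriptor. Illustrative assumptions only."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(contrast_pool(x))
```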
“…Hui et al. [41] developed a contrast-aware CA mechanism at the end of the progressive refinement module to steadily boost the accuracy of image SR. Chen et al. [42] proposed an attention-in-attention network, in which a non-attention branch and a coupling attention branch adjust the contribution of attention layers dynamically.…”
Section: Related Work
confidence: 99%