2023
DOI: 10.1109/tmm.2022.3214780
T-Net: Deep Stacked Scale-Iteration Network for Image Dehazing

Cited by 20 publications (4 citation statements)
References 71 publications
“…Our method is compared with 10 state-of-the-art methods: DehazeNet 16 (2016), AOD-Net 17 (2017), GFN 19 (2018), DCPDN 18 (2018), EPDN 20 (2019), Y-Net 22 (2020), Domain Adaptation Dehazing Network (DADN) 53 (2020), Stack T-Net 54 (2021), FSAD-Net 55 (2022), and GP2P+ 56 (2022).…”
Section: Methods
confidence: 99%
“…In addition, considering that the features of different scales may not be equally important, we use a simple weighted attention fusion strategy to achieve feature fusion from the different row and column dimensions. Inspired by [79,62], we first generate two trainable weights for different features, where each parameter is an n-dimensional vector (n is the channels of feature). We add these weighted features to derive the fusion features.…”
Section: Gridformer Architecture
confidence: 99%
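The weighted fusion described in this citation — one trainable n-dimensional weight vector per feature (n = channel count), scaled features summed — can be sketched minimally in numpy. Shapes, names, and the elementwise form are assumptions for illustration, not the citing paper's exact implementation:

```python
import numpy as np

def weighted_attention_fusion(feat_a, feat_b, w_a, w_b):
    """Fuse two feature maps with per-channel trainable weights.

    feat_a, feat_b: feature maps of shape (C, H, W)
    w_a, w_b: trainable per-channel weight vectors of shape (C,)
    (Hypothetical sketch of the weighted attention fusion strategy.)
    """
    # Broadcast each C-dimensional weight over the spatial dims,
    # scale the corresponding feature map, then add the results.
    return w_a[:, None, None] * feat_a + w_b[:, None, None] * feat_b

# Usage: fuse two 4-channel 8x8 feature maps with complementary weights.
a = np.ones((4, 8, 8))
b = np.ones((4, 8, 8))
w1 = np.full(4, 0.25)
w2 = np.full(4, 0.75)
fused = weighted_attention_fusion(a, b, w1, w2)
```

In a training setting the weight vectors would be learned parameters; here they are fixed constants so the sketch stays self-contained.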
“…Previous works [18,78,36,75,79] have shown that using dense connections has many advantages, mitigating the vanishing gradient problem, encouraging feature reuse and enhancing information propagation. Accordingly, we propose to design the transformer with dense connections to build the basic GridFormer layers.…”
Section: Residual Dense Transformer Block
confidence: 99%
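The dense-connection pattern this citation refers to — each layer receiving the concatenation of the block input and all earlier layer outputs — can be illustrated with a toy numpy sketch. The stand-in `toy_layer` is an assumption; in GridFormer the layers are transformer layers:

```python
import numpy as np

def dense_block(x, layers):
    """Apply layers with dense connectivity: each layer sees the
    channel-wise concatenation of the input and all earlier outputs,
    which encourages feature reuse and eases gradient flow.
    (Illustrative sketch; not the paper's actual block.)
    """
    features = [x]
    for layer in layers:
        out = layer(np.concatenate(features, axis=0))  # concat channels
        features.append(out)
    return np.concatenate(features, axis=0)

# Toy "layer": reduces any channel count to 2 channels
# (channel-wise mean and max), standing in for a transformer layer.
def toy_layer(t):
    return np.stack([t.mean(axis=0), t.max(axis=0)])

x = np.random.rand(2, 4, 4)
y = dense_block(x, [toy_layer, toy_layer, toy_layer])
# 2 input channels + 3 layers x 2 channels each = 8 output channels
```

The growing channel count at the output is the signature of dense connections: every intermediate feature survives to the block output rather than being overwritten.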