2018
DOI: 10.1080/2150704x.2018.1492170

SAR image despeckling using a dilated densely connected network

Cited by 20 publications (9 citation statements)
References 13 publications
“…In addition, skip connections help reduce the vanishing-gradient problem. Along the same lines, Gui et al. [37] used dilated convolution and residual learning with a densely connected network. In addition, Li et al. [38] relied on dilated convolution and residual training, the main innovation being the use of a convolutional block attention module to enhance representational power and performance.…”
Section: Related Work
confidence: 99%
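The combination described above (dilated convolution plus residual learning) can be sketched in one dimension. This is an illustrative NumPy toy, not the cited networks' actual implementation: `dilated_conv1d` and `residual_despeckle` are hypothetical names, and a single hand-set kernel stands in for trained convolutional layers. The residual-learning idea is that the network predicts the speckle component, which is then subtracted from the noisy input.

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """1-D dilated convolution; zero padding keeps the output length.

    A dilation of d inserts d-1 gaps between kernel taps, enlarging the
    receptive field without adding parameters.
    """
    k = len(kernel)
    pad = dilation * (k - 1) // 2
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        for j in range(k):
            out[i] += kernel[j] * xp[i + j * dilation]
    return out

def residual_despeckle(noisy, kernel, dilation=2):
    """Residual learning: predict the noise, subtract it from the input."""
    predicted_noise = dilated_conv1d(noisy, kernel, dilation)
    return noisy - predicted_noise
```

With a centered identity kernel the "predicted noise" equals the input, so the residual output is zero; in a real network the kernels are learned so that the predicted component approximates the speckle alone.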
“…Even though ablation studies seem to support the importance of attention modules, in general it is not obvious how spatial attention, in particular, helps achieve better image restoration. SAR-DDCN, proposed in [53], is conceptually similar to SAR-DRN, the main innovation being the introduction of two 5-layer dense blocks. Dense connections [79] are well known to allow better propagation of features in very deep networks and hence reduce the vanishing-gradient problem, a major issue in the training of very deep CNNs.…”
Section: A Direct DL-based Despeckling
confidence: 99%
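The dense-connection pattern referenced above can be sketched as follows. This is a minimal NumPy stand-in, not the SAR-DDCN architecture: `layer` is a hypothetical placeholder (a fixed random linear map) for a learned convolutional layer, and feature maps are modeled as vectors. The point illustrated is the wiring: layer i receives the concatenation of the block input and the outputs of all preceding layers.

```python
import numpy as np

def layer(features, seed):
    """Placeholder for a conv layer: a fixed random linear combination
    of the stacked input features (illustration only)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(features.shape[0])
    return w @ features  # collapses the stack into one output feature

def dense_block(x, n_layers=5):
    """Densely connected block: every layer sees all earlier outputs."""
    feats = [x]                       # running list of feature maps
    for i in range(n_layers):
        out = layer(np.stack(feats), seed=i)
        feats.append(out)             # made available to all later layers
    return np.stack(feats[1:])        # the n_layers outputs
```

Because every layer has a direct path to the loss through these connections, gradients reach early layers without being attenuated through a long sequential chain, which is the vanishing-gradient benefit the statement describes.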
“…Indeed, the explicit goal of the method, based on the analysis of previous literature, is to use a deeper architecture to extract more expressive features. The computational load is reduced by means of simplified dense connections: 5-layer blocks are used as in [53], but only the last layer in the block receives inputs from all preceding layers. Then, the same structure is replicated at the block level, thus constructing a hierarchical multiconnection network.…”
Section: A Direct DL-based Despeckling
confidence: 99%
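The simplified dense connections described above (only the last layer of a block fuses all preceding outputs, with the same pattern replicated at block level) can be sketched like this. All names are hypothetical and NumPy stand-ins replace learned layers; summation stands in for concatenation-plus-convolution as the fusion step.

```python
import numpy as np

def conv_layer(x, seed):
    """Placeholder for a conv layer: a fixed elementwise scaling."""
    rng = np.random.default_rng(seed)
    return x * rng.uniform(0.5, 1.5, size=x.shape)

def simplified_dense_block(x, n_layers=5):
    """Layers run sequentially; only the LAST layer fuses the outputs
    of all preceding layers (a simplified dense connection)."""
    outs, h = [x], x
    for i in range(n_layers - 1):
        h = conv_layer(h, seed=i)
        outs.append(h)
    return conv_layer(np.sum(outs, axis=0), seed=n_layers)

def hierarchical_network(x, n_blocks=3):
    """Same pattern one level up: only the final block fuses the
    outputs of all previous blocks."""
    outs, h = [x], x
    for b in range(n_blocks - 1):
        h = simplified_dense_block(h)
        outs.append(h)
    return simplified_dense_block(np.sum(outs, axis=0))
```

Compared with full dense connectivity, this keeps one long-range connection per layer instead of all pairwise ones, which is the computational saving the statement attributes to the method.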
“…Zhang [39] combined skip connections [40] and dilated convolution [41] to achieve SAR despeckling. Similarly, Gui [42] proposed a network using dilated convolution and a densely connected network [43]. Lattari [44] successfully used the U-Net CNN architecture for SAR image speckle suppression.…”
Section: Introduction
confidence: 99%