2018
DOI: 10.1016/j.neucom.2018.02.027

High-frequency details enhancing DenseNet for super-resolution

Cited by 27 publications (10 citation statements)
References 13 publications
“…Traditional neural networks to improve image quality are convolutional neural network (CNN) and generative adversarial network (GAN). One resolution improvement technique known as super-resolution CNN (SRCNN) estimates unsharpness through the trained neural network and restores the resolution of the blurred input image [8,10–18]. As CPU and GPU technologies develop, very deep convolution layers of neural networks can be calculated.…”
Section: Deep Learning For High-resolution and High-sensitivity Inter…
confidence: 99%
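The SRCNN pipeline summarized in the excerpt above — upscale the blurred input, then apply a learned mapping that restores sharpness — can be sketched as follows. This is a minimal illustration only: the upscaling and the fixed sharpening kernel are hypothetical stand-ins for the bicubic interpolation and the trained 9-1-5 convolution filters of the actual SRCNN.

```python
import numpy as np

def conv2d(img, kernel):
    # Naive "same" convolution with zero padding (single channel).
    kh, kw = kernel.shape
    pad = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kernel)
    return out

def srcnn_like(lowres, scale=2):
    # Step 1: crude nearest-neighbour upscaling (SRCNN uses bicubic).
    up = np.kron(lowres, np.ones((scale, scale)))
    # Step 2: a fixed sharpening kernel standing in for the learned
    # non-linear mapping that restores high-frequency detail.
    sharpen = np.array([[0, -1, 0],
                        [-1, 5, -1],
                        [0, -1, 0]], dtype=float)
    return conv2d(up, sharpen)
```

The key design point the excerpt makes is the order of operations: resolution is raised first, and the network only has to learn the residual mapping from a blurry upscaled image to a sharp one.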
“…This means that enhancing bird shape and shallow image features is helpful to improve the classification performance of CNN models. Research results on attention mechanism techniques confirm that reinforcing image features in the target region brings a positive effect on model performance [24,27,28], while the success of deep dense networks also shows the positive effect of reusing shallow network features [29–33].…”
Section: Methods
confidence: 99%
“…While increasing the depth of the convolutional neural network had its benefits, there existed the challenge of vanishing information about the gradient or the input when going through the layers [29]. The authors proposed a simple connectivity-based architecture to facilitate the maximum flow of information through the layers through both the forward computation and the backward computation.…”
Section: DenseNet
confidence: 99%
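The dense connectivity the excerpt describes — every layer receiving the concatenation of all earlier feature maps, so information and gradients flow directly to every preceding layer — can be sketched with numpy. The `conv_like` helper here is a hypothetical stand-in for a real convolution; only the concatenation-based wiring is the point.

```python
import numpy as np

def conv_like(x, out_ch, seed=0):
    # Stand-in for a convolutional layer: a random channel-mixing
    # linear map over a (channels, H, W) tensor. Illustrative only.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((x.shape[0], out_ch)) * 0.1
    return np.einsum('chw,ck->khw', x, w)

def dense_block(x, growth=4, layers=3):
    # DenseNet connectivity: each layer's input is the concatenation
    # of the block input and ALL earlier layers' outputs, so shallow
    # features are reused and gradients reach every layer directly.
    features = [x]
    for i in range(layers):
        inp = np.concatenate(features, axis=0)
        features.append(conv_like(inp, growth, seed=i))
    return np.concatenate(features, axis=0)

x = np.zeros((8, 16, 16))
y = dense_block(x, growth=4, layers=3)
# Output channels grow as input + layers * growth: 8 + 3 * 4 = 20.
```

This wiring is what the cited authors mean by "maximum flow of information": unlike a plain chain of layers, no feature map has to survive repeated transformations to reach a later layer.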