Optimization Method of Residual Networks of Residual Networks for Image Classification
2018 · DOI: 10.1007/978-3-319-95957-3_23

Cited by 3 publications (2 citation statements) · References 14 publications
“…The recurrent neural network (RNN) [21], long short-term memory (LSTM) [22,23], and gated recurrent unit (GRU) [24] are commonly used for one-dimensional signals such as time sequence signal [25] classification. In terms of two-dimensional image identification, the deep convolutional network [26,27], residual network (Resnet) [28,29], Densenet [30,31] and their improved structures have been widely used in image classification tasks. Ouahabi [32] used the deep-learning method to perform real-time semantic segmentation of ultrasonic acoustic images.…”
Section: Introduction (mentioning)
confidence: 99%
“…Valeryi et al. [26] proposed a technique for automatically labeling the data set with a finite-element model for training an artificial neural network in tomography. Lin et al. [15] proposed a Residual networks of Residual networks (RoR) optimization method to avoid over-fitting and gradient vanishing. Gomez et al. [6] found that in ResNet, increasing depth improves performance but also increases memory consumption, and therefore proposed that each layer's activations be reconstructed exactly from the next layer's.…”
Section: Introduction (mentioning)
confidence: 99%
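The reversibility idea attributed to Gomez et al. [6] in the citation statement above — that a layer's activations can be reconstructed exactly from the next layer's, so they need not be stored during the forward pass — can be sketched numerically. This is a minimal illustration, not the cited paper's architecture: the residual functions `F` and `G` below are placeholder dense layers, where a real network would use convolutions.

```python
import numpy as np

# A reversible residual block splits its input into two halves (x1, x2):
#   y1 = x1 + F(x2)
#   y2 = x2 + G(y1)
# The inverse recovers (x1, x2) from (y1, y2) alone, so intermediate
# activations need not be cached for backpropagation.
rng = np.random.default_rng(0)
W_f = rng.standard_normal((4, 4))
W_g = rng.standard_normal((4, 4))

def F(x):
    return np.tanh(x @ W_f)  # placeholder residual function

def G(x):
    return np.tanh(x @ W_g)  # placeholder residual function

def forward(x1, x2):
    y1 = x1 + F(x2)
    y2 = x2 + G(y1)
    return y1, y2

def inverse(y1, y2):
    # Reconstruct the inputs exactly from the outputs.
    x2 = y2 - G(y1)
    x1 = y1 - F(x2)
    return x1, x2

x1, x2 = rng.standard_normal((2, 4))
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
assert np.allclose(x1, r1) and np.allclose(x2, r2)
```

The reconstruction is exact (up to floating-point rounding) because each update adds a function of the *other* half, so subtraction undoes it step by step in reverse order.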