2017
DOI: 10.1007/s10586-017-1443-x

Pyramidal RoR for image classification

Abstract: The Residual Networks of Residual Networks (RoR) exhibits excellent performance on the image classification task, but sharply increasing the number of feature-map channels makes the transmission of characteristic information incoherent, which loses some of the information relevant to classification prediction and limits classification performance. In this paper, a Pyramidal RoR network model is proposed by analysing the performance characteristics of RoR and combining it with PyramidNet. Firstly, ba…
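To make the idea in the abstract concrete, the following is a minimal, illustrative sketch (not the authors' code) of a pyramidal residual unit in the PyramidNet style that the paper combines with RoR. It assumes PyTorch, a pre-activation layout, and a zero-padded identity shortcut; the class name PyramidalUnit and all parameters are hypothetical. Channel counts grow by a small step at every unit instead of doubling at stage boundaries, so feature widths change gradually rather than sharply.

# Hypothetical sketch of a PyramidNet-style unit (assumed details, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidalUnit(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        # Pre-activation residual branch: BN-Conv-BN-ReLU-Conv-BN.
        self.branch = nn.Sequential(
            nn.BatchNorm2d(in_ch),
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, stride=1, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
        )
        self.stride = stride
        self.extra_ch = out_ch - in_ch  # channels added by the pyramidal widening rule

    def forward(self, x):
        out = self.branch(x)
        shortcut = x
        if self.stride != 1:
            shortcut = F.avg_pool2d(shortcut, self.stride)
        if self.extra_ch > 0:
            # Zero-pad the identity shortcut along the channel axis, so the
            # gradually widened branch can be added without a projection.
            pad = out.new_zeros(shortcut.size(0), self.extra_ch,
                                shortcut.size(2), shortcut.size(3))
            shortcut = torch.cat([shortcut, pad], dim=1)
        return out + shortcut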

Cited by 16 publications (7 citation statements)
References 35 publications
“…The experiments showed that the classification performance of the proposed network achieved competitive performance on some public datasets, compared to conventional networks. It is worth mentioning that the classification accuracy can be further improved by applying the MSL to more advanced pyramidal residual networks [32, 33] which are currently state of the art on CIFAR‐10 (2.96%) and CIFAR‐100 (16.4%) datasets.…”
Section: Discussion
confidence: 99%
“…Other heuristics have also been explored. For example, the pyramidal rule [23,24] suggested to gradually increase the channels in all convolutions layer by layer, regardless of spatial size. Figure 1 visually summarizes these heuristics for setting channel numbers in a neural network.…”
Section: Introduction
confidence: 99%
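The "pyramidal rule" quoted above can be made concrete with a small, hypothetical schedule comparison in pure Python: conventional stage-wise doubling versus per-layer pyramidal widening. Function names and constants such as alpha=48 are illustrative assumptions, not values from the cited papers.

# Illustrative comparison of channel-width heuristics (hypothetical helpers).

def stagewise_channels(base=16, units_per_stage=6, stages=3):
    # Double the width only when a new stage (spatial downsampling) begins.
    return [base * (2 ** s) for s in range(stages) for _ in range(units_per_stage)]

def pyramidal_channels(base=16, total_units=18, alpha=48):
    # Add roughly alpha / total_units channels at every unit, regardless of spatial size.
    return [base + int(alpha * (k + 1) / total_units) for k in range(total_units)]

print(stagewise_channels())   # [16, 16, ..., 32, 32, ..., 64, 64, ...]: abrupt jumps
print(pyramidal_channels())   # [18, 21, 24, ..., 64]: a gradual ramp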
“…Beyond the macro-level heuristics across entire network, recent works [6,13,24,25,26] have also digged into channel configuration for micro-level building blocks (a network building block is usually composed of several 1 × 1 and 3 × 3 convolutions). These micro-level heuristics have led to better speed-accuracy trade-offs.…”
Section: Introduction
confidence: 99%