2019
DOI: 10.1007/978-3-030-21005-2_23
Strided Convolution Instead of Max Pooling for Memory Efficiency of Convolutional Neural Networks

Cited by 44 publications (25 citation statements) · References 9 publications
“…In consecutive layers, features are further reduced to the dimensions of 17 × 40, 17 × 20 and 17 × 10. Feature reduction with strides is an alternative to pooling layers in CNNs [35,36].…”
Section: CNN Model
confidence: 99%
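The dimension schedule quoted above (width halving 40 → 20 → 10 while the height stays 17) follows directly from the standard convolution output-size formula when the stride along the width is 2 and the stride along the height is 1. A minimal sketch, assuming a 3 × 3 kernel with padding 1 (the excerpt gives only the resulting dimensions, not the kernel configuration):

```python
def conv_out_size(n, k=3, p=1, s=1):
    """Standard convolution output size: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

# Width shrinks 40 -> 20 -> 10 under stride 2; height 17 is preserved under stride 1.
width = 40
widths = [width]
for _ in range(2):
    width = conv_out_size(width, k=3, p=1, s=2)
    widths.append(width)

height = conv_out_size(17, k=3, p=1, s=1)
```

With these (assumed) kernel settings the strided layers reproduce the quoted schedule exactly, which is the sense in which striding substitutes for a pooling layer: the downsampling happens inside the convolution itself.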
“…In addition to the convolutional layers, other changes were suggested to stabilize the GAN's training. Replacing pooling layers with strided convolutions has shown better performance [134,6]. It is therefore proposed to use strided convolutions in both G and D.…”
Section: Deep Convolutional GAN (DCGAN)
confidence: 99%
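In the DCGAN setting referenced above, the discriminator downsamples purely through strided convolutions rather than pooling. A sketch of the resulting spatial-size schedule, assuming the common DCGAN configuration of a 4 × 4 kernel, stride 2, and padding 1 (these hyperparameters are an assumption, not stated in the excerpt):

```python
def conv_out_size(n, k=4, s=2, p=1):
    """Output spatial size of a strided convolution: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

# Discriminator path: each strided conv halves the spatial resolution,
# so no max-pooling layer is needed anywhere in the network.
sizes = [64]
while sizes[-1] > 4:
    sizes.append(conv_out_size(sizes[-1]))
```

With k=4, s=2, p=1 each layer halves the resolution exactly, so a 64 × 64 input reaches 4 × 4 in four layers; the generator mirrors this with strided transposed convolutions in the opposite direction.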
“…On the other hand, for downsampling, strided convolution layers were chosen over classical pooling because their weights are trainable, whereas pooling performs a fixed operation. In this manner, the network can learn how to summarize the data, thus improving its accuracy (Ayachi et al, 2018).…”
Section: Architecture
confidence: 99%
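The trainable-versus-fixed distinction drawn above can be made concrete: a strided convolution with particular weights can reproduce a fixed pooling operation, but because those weights are learned, it can also discover a better summary. A minimal single-channel NumPy sketch (hypothetical helper, not from the cited works) showing that a 2 × 2 stride-2 convolution with uniform 0.25 weights is exactly 2 × 2 average pooling:

```python
import numpy as np

def strided_conv2d(x, w, stride=2):
    """Valid-mode 2D convolution of a single-channel map x with kernel w."""
    k = w.shape[0]
    H, W = x.shape
    oh = (H - k) // stride + 1
    ow = (W - k) // stride + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.sum(patch * w)
    return out

x = np.arange(16.0).reshape(4, 4)

# Fixed uniform weights make the strided conv identical to 2x2 average pooling;
# during training these four weights would be free to move away from 0.25.
avg_kernel = np.full((2, 2), 0.25)
conv_out = strided_conv2d(x, avg_kernel, stride=2)

# Reference 2x2 average pooling via reshape-and-mean.
pool_out = x.reshape(2, 2, 2, 2).mean(axis=(1, 3))
```

Max pooling has zero parameters; the strided replacement adds k·k·C_in·C_out weights per layer, which is the (small) memory cost paid for a learnable downsampling.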