2018
DOI: 10.1007/978-3-319-73603-7_10
Convolution with Logarithmic Filter Groups for Efficient Shallow CNN

Cited by 13 publications (3 citation statements). References 13 publications.
“…Deformable convolution is able to focus only on regions of interest, making the feature maps more representative. Moreover, there exists a variety of convolutions, such as separable convolutions [24], [25], [26], [27], [28], group convolutions [11], [29], [30], [31], and multi-dimensional convolutions, which are discussed in Section 3 and Section 5.…”
Section: Brief Overview of CNN
confidence: 99%
“…To further improve the segmentation accuracy, shuffle [ 11 ] and group [ 12 ] convolutions were added to the segmentation architecture. By integrating group convolution, segmentation features can be learned in a block-wise way, where highly correlated filters can be trained efficiently [ 13 ]. A shuffle operation was also added to overcome the issue of limited training variations, because only a small fraction of the input channels is utilized in standard group convolution.…”
Section: Introduction
confidence: 99%
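The shuffle operation mentioned above can be sketched concretely. Below is a minimal NumPy illustration (shapes and names are my own, not from the cited papers) of the channel shuffle that re-interleaves grouped-convolution outputs so that each group in the next layer receives channels from every group in the previous one:

```python
import numpy as np

def channel_shuffle(x, groups):
    """Re-interleave channels across groups (ShuffleNet-style shuffle).

    x: feature map of shape (N, C, H, W); C must be divisible by groups.
    """
    n, c, h, w = x.shape
    assert c % groups == 0
    # (N, G, C/G, H, W) -> swap the group and per-group channel axes -> flatten
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

# With C=4 channels in G=2 groups, channel order [0,1,2,3] becomes [0,2,1,3]:
x = np.arange(4).reshape(1, 4, 1, 1).astype(float)
shuffled = channel_shuffle(x, groups=2)
print(shuffled.ravel())  # [0. 2. 1. 3.]
```

The reshape/transpose pair costs no learned parameters, which is why it combines well with group convolution in compact models.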
“…Compared to a single SCNN, the architecture uses a combination of CNN and SVM to achieve a higher classification result. Lee et al. [9] introduce a logarithmic group convolution module with 1×1 and 3×3 convolutions to reduce the model size in SCNN. In [10]- [12], the authors apply batch normalization (BN) [13] to SCNN to speed up training and improve accuracy.…”
Section: Introduction
confidence: 99%
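As a rough illustration of why grouping filters shrinks a model, the sketch below compares the parameter count of a dense 1×1 convolution with a grouped one whose group sizes halve logarithmically. The specific split (C/2, C/4, C/8, C/8) is an assumption for illustration only, not necessarily the exact module of [9]:

```python
def pointwise_params(c_in, c_out):
    # Parameters of a 1x1 convolution: one weight per (input, output) channel pair.
    return c_in * c_out

def log_group_sizes(c, depth=3):
    # Halve the group size repeatedly (C/2, C/4, C/8), then one remainder group.
    sizes = [c >> (i + 1) for i in range(depth)]
    sizes.append(c - sum(sizes))
    return sizes

c = 64
full = pointwise_params(c, c)                          # dense 1x1 conv over all channels
groups = log_group_sizes(c)                            # [32, 16, 8, 8]
grouped = sum(pointwise_params(g, g) for g in groups)  # each group convolves only itself
print(full, groups, grouped)  # 4096 [32, 16, 8, 8] 1408
```

Here the grouped layer needs roughly a third of the dense layer's parameters, which is the kind of saving that makes such modules attractive for shallow, resource-constrained CNNs.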