2018
DOI: 10.3390/a11100159
A Faster Algorithm for Reducing the Computational Complexity of Convolutional Neural Networks

Abstract: Convolutional neural networks have achieved remarkable improvements in image and video recognition but incur a heavy computational burden. To reduce the computational complexity of a convolutional neural network, this paper proposes an algorithm based on the Winograd minimal filtering algorithm and Strassen algorithm. Theoretical assessments of the proposed algorithm show that it can dramatically reduce computational complexity. Furthermore, the Visual Geometry Group (VGG) network is employed to evaluate the a…
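The abstract names two multiplication-reducing techniques. As a generic illustration (not code from the paper), the Strassen idea can be shown at its base case: multiplying two 2×2 matrices with 7 scalar multiplications instead of the standard 8, applied recursively to blocks in the full algorithm.

```python
def strassen_2x2(A, B):
    """Strassen's base case: multiply 2x2 matrices A and B using
    7 multiplications (m1..m7) instead of the usual 8."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # Recombine the 7 products into the 4 entries of C = A @ B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

# Sanity check against the standard formula:
# [[1,2],[3,4]] @ [[5,6],[7,8]] = [[19,22],[43,50]]
```

The trade-off is the same as in the paper's setting: fewer multiplications at the cost of extra additions, which pays off when multiplications dominate, as in convolution layers.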

Cited by 19 publications (12 citation statements)
References 13 publications
“…Winograd filtering reduces the number of multiplications at the cost of more additions. The filtering can be efficiently implemented with lookup tables and shifts in FPGA [49][50][51].…”
Section: Reduction of Complexity of CNN Models
confidence: 99%
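The statement above summarizes the Winograd trade: fewer multiplications, more additions. A minimal illustration (not taken from the paper) is the F(2,3) transform, which produces 2 outputs of a 3-tap filter with 4 multiplications instead of the 6 needed by direct convolution:

```python
def winograd_f23(d, g):
    """Winograd minimal filtering F(2,3): compute 2 outputs of a
    3-tap FIR filter g over 4 inputs d using 4 multiplications
    (m1..m4) instead of the 6 required by direct convolution."""
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    m1 = (d0 - d2) * g0
    m2 = (d1 + d2) * (g0 + g1 + g2) / 2
    m3 = (d2 - d1) * (g0 - g1 + g2) / 2
    m4 = (d1 - d3) * g2
    # y0 = d0*g0 + d1*g1 + d2*g2 ; y1 = d1*g0 + d2*g1 + d3*g2
    return [m1 + m2 + m3, m2 - m3 - m4]

def direct_f23(d, g):
    """Reference: the same 2 outputs by direct convolution (6 multiplies)."""
    return [d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
            d[1]*g[0] + d[2]*g[1] + d[3]*g[2]]
```

The filter-side transforms (the terms involving only g) can be precomputed once per filter, which is why FPGA implementations can fold them into lookup tables and shifts as the cited works note.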
“…[29] and Zhao et al. [30], we have computed the computational complexity of our proposed two-stage CNN. The complexity of the first stage is 904.5×10^8 and the second stage is 1.15×10^8.…”
Section: Results and Discussion
confidence: 99%
“…Winograd filtering is a known technique to reduce the number of multiplications of a convolution. The technique was efficiently implemented on FPGA [147][148][149][150].…”
Section: Hardware-Oriented Deep Neural Network Optimizations
confidence: 99%