2022 International Conference on Cloud Computing, Big Data and Internet of Things (3CBIT)
DOI: 10.1109/3cbit57391.2022.00068
The inference operation optimization of an improved LeNet-5 convolutional neural network and its FPGA hardware implementation

Cited by 2 publications (1 citation statement)
References 9 publications
“…In this section, we compare our work with other FPGA implementations targeting similar datasets and provide a reasonable summary of our implementation results, as shown in Table I. In the study [29], they reduced the scale of the LeNet-5 model using partial 1-bit quantization for convolution and pooling layers. They generated IP using HLS for deployment on an FPGA to recognize MNIST handwritten digits.…”
Section: Results and Analysis
Confidence: 99%
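The cited statement refers to partial 1-bit (binary) quantization of the convolution and pooling layers. As a rough illustration only, the sketch below shows one common form of 1-bit weight quantization (sign binarization with a per-tensor scale); the function name and the scaling scheme are assumptions for illustration and are not taken from the cited paper or its HLS implementation.

```python
import numpy as np

def binarize_weights(w):
    """1-bit (sign) quantization sketch: each weight becomes +alpha or -alpha,
    where alpha is the mean absolute value (a common per-tensor scale choice)."""
    alpha = np.mean(np.abs(w))      # scale factor preserving average magnitude
    return alpha * np.sign(w)

# Example: binarize a LeNet-5-style first convolution layer
# (6 output channels, 1 input channel, 5x5 kernels).
kernel = np.random.randn(6, 1, 5, 5)
kernel_1bit = binarize_weights(kernel)
print(np.unique(np.round(kernel_1bit, 4)))  # essentially two distinct values remain
```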