2022
DOI: 10.3390/s22124318

Design Space Exploration of a Sparse MobileNetV2 Using High-Level Synthesis and Sparse Matrix Techniques on FPGAs

Abstract: Convolutional Neural Networks (CNNs) are gaining ground in deep learning and Artificial Intelligence (AI) domains, and they can benefit from rapid prototyping in order to produce efficient and low-power hardware designs. The inference process of a Deep Neural Network (DNN) is considered a computationally intensive process that requires hardware accelerators to operate in real-world scenarios due to the low-latency requirements of real-time applications. As a result, High-Level Synthesis (HLS) tools are gaining p…

Cited by 11 publications (2 citation statements)
References 19 publications
“…The original paper on MobileNetV2 [8] does not provide a formal architectural graph of the model; however, in a related study [10], a comprehensive architectural graph of the model has been made available.…”
Section: Description of Proposed Solution/Design (mentioning)
confidence: 99%
“…Therefore, due to strict power and performance requirements, it is very challenging to realize different digital signal processing algorithms as efficient VLSI designs [19]. Designers are required to meet several constraints on power consumption, clock rate, die area, cost, and reconfigurability while also achieving balanced trade-offs among the above characteristics [20,21].…”
Section: Introduction (mentioning)
confidence: 99%