2022
DOI: 10.1016/j.micpro.2022.104634
A novel framework for deployment of CNN models using post-training quantization on microcontroller

Cited by 5 publications (2 citation statements)
References 5 publications
“…Mainstream microcontrollers are capable enough to perform machine learning classification [26] or even to run deep neural network models; for example, a Convolutional Neural Network (CNN) model trained with TensorFlow can be deployed on a Cortex-M4 microcontroller. Sailesh et al. [27] provided a framework that automates code generation for a CNN model on a microcontroller. However, the limited memory of IoT devices restricts the use of CNNs in the IoT.…”
Section: Edge Computing (mentioning, confidence: 99%)
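The citing text describes the deployment flow only at a high level. As a concrete illustration (not the authors' own framework), the sketch below shows a typical TensorFlow Lite full-integer post-training quantization step of the kind such a pipeline would run before generating microcontroller code; the model path, input shape, and calibration data are placeholders.

import numpy as np
import tensorflow as tf

# Load a trained Keras CNN model (hypothetical path, stands in for any
# TensorFlow-trained model targeted at a Cortex-M class device).
model = tf.keras.models.load_model("cnn_model.h5")

# Representative dataset used to calibrate activation ranges during
# full-integer post-training quantization (placeholder random samples).
def representative_data_gen():
    for _ in range(100):
        yield [np.random.rand(1, 32, 32, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Restrict the converter to int8 kernels so the model can run on an
# integer-only microcontroller runtime (e.g. TFLite Micro with CMSIS-NN).
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("cnn_model_int8.tflite", "wb") as f:
    f.write(tflite_model)

# The resulting flatbuffer is then typically embedded in firmware as a
# C array (for example via `xxd -i`) and linked into the application.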
“…Early CNN models used 64/32-bit floating-point arithmetic. As researchers began to study accelerators rather than GPUs, floating-point representations were converted to fixed point to reduce computational complexity [15]–[17], and interest grew in quantization techniques that lower precision [18]–[20]. Many CNN models have shown that when precision is lowered, hardware complexity is significantly reduced while the loss in accuracy is negligible or small [21], [22].…”
Section: Introduction (mentioning, confidence: 99%)
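The excerpt refers to lowering precision from floating point to fixed point. As a worked illustration (not taken from the cited paper), the short sketch below implements the standard asymmetric affine mapping, real ≈ scale · (q − zero_point), used by int8 post-training quantization; the tensor values are random placeholders.

import numpy as np

def quantize_int8(x):
    # Asymmetric affine quantization of a float32 tensor to int8:
    #   real_value ~= scale * (quantized_value - zero_point)
    qmin, qmax = -128, 127
    x_min = min(float(x.min()), 0.0)  # range must include 0.0
    x_max = max(float(x.max()), 0.0)
    scale = (x_max - x_min) / (qmax - qmin) or 1.0  # guard against an all-zero tensor
    zero_point = int(round(qmin - x_min / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return scale * (q.astype(np.float32) - zero_point)

# Example: the round-trip error on random weights stays around scale / 2,
# which is the "negligible or small" accuracy cost the excerpt mentions.
w = np.random.randn(4, 4).astype(np.float32)
q, s, z = quantize_int8(w)
print(np.abs(w - dequantize(q, s, z)).max())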