2016 IEEE International Conference on Consumer Electronics-Asia (ICCE-Asia)
DOI: 10.1109/icce-asia.2016.7804818
Measuring error-tolerance in SRAM architecture on hardware accelerated neural network

Cited by 9 publications (3 citation statements); references 4 publications.
“…Austin et al. [7] discovered a correlation between weight error and classification accuracy for CNNs and MLPs. Kwon et al. [8] introduced Gaussian noise into the weight parameters of LeNet and observed varying error tolerance among the convolutional layers. However, the error-injection techniques and analysis methods for neural networks in these studies are relatively simple, and they did not analyze the effect of SEUs on neural networks by building models.…”
Section: Introduction
confidence: 99%
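The Gaussian-noise injection described in the statement above can be sketched in a few lines. This is a minimal illustration, not the cited authors' method: it uses a tiny two-layer NumPy network (a stand-in for LeNet), perturbs one layer's weights at a time with zero-mean Gaussian noise, and measures how often the perturbed model still agrees with the clean model's predictions. All layer names, sizes, and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer MLP with fixed random weights (illustrative stand-in
# for LeNet; sizes and values are not from the cited papers).
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 4))

def forward(x, w1, w2):
    h = np.maximum(x @ w1, 0.0)   # ReLU hidden layer
    return h @ w2                 # linear output layer

def agreement(w1, w2, x, ref):
    """Fraction of inputs whose argmax prediction matches the clean model."""
    pred = forward(x, w1, w2).argmax(axis=1)
    return float((pred == ref).mean())

x = rng.standard_normal((200, 8))
ref = forward(x, W1, W2).argmax(axis=1)   # clean-model predictions

# Inject zero-mean Gaussian noise into one layer at a time and compare
# how much the prediction agreement degrades per layer.
sigma = 0.5
perturbed = {
    "layer1": (W1 + sigma * rng.standard_normal(W1.shape), W2),
    "layer2": (W1, W2 + sigma * rng.standard_normal(W2.shape)),
}
for name, (w1, w2) in perturbed.items():
    print(name, agreement(w1, w2, x, ref))
```

Repeating this per layer, as the cited studies did, exposes which layers tolerate weight perturbations least.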
“…Since not all errors are critical for a CNN, it is crucial to accurately identify the critical layers of the CNN to balance the utilization of TMR hardware resources against hardening effectiveness. In [16], the authors tested the robustness of the CNN architecture by injecting Gaussian noise into the weights of the CNN and concluded that a layer's error tolerance tended to worsen the closer it sat to the output layer. However, the sensitivity of weight-related SEUs in different layers is influenced by various factors, such as the number of weights, the weight value distribution, the network structure, and the layer position.…”
Section: Introduction
confidence: 99%
“…For example, [22] evaluated the reliability of a CNN with 4 convolutional layers under bit flips on the weights and showed that CNNs with large kernels are more robust than those with smaller kernels. In [23], LeNet-5 was evaluated with Gaussian white noise on the weights, and it was found that errors in the layers closer to the output layer have a larger effect on network performance. The effects of weight errors in different layers of LeNet-5 were studied in [24]; convolutional layers are more error-resilient than fully connected layers.…”
Section: I. Introduction
confidence: 99%
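The bit-flip fault model mentioned in [22] — a single-event upset toggling one bit of a stored weight — can be sketched directly on float32 storage. This is a hedged illustration, not code from any cited paper; the helper name and the chosen bit position are assumptions.

```python
import numpy as np

def flip_bit(weights: np.ndarray, index: int, bit: int) -> np.ndarray:
    """Return a copy of a float32 weight array with one bit flipped in
    one element, emulating a single-event upset (SEU) in SRAM storage.
    `index` selects the flat element, `bit` the bit position (0-31)."""
    flat = weights.astype(np.float32).ravel().copy()
    bits = flat.view(np.uint32)            # reinterpret the same bytes as integers
    bits[index] ^= np.uint32(1) << bit     # XOR toggles the chosen bit
    return flat.reshape(weights.shape)

w = np.zeros((2, 2), dtype=np.float32)
w_seu = flip_bit(w, index=0, bit=30)       # flip a high exponent bit
print(w_seu[0, 0])                         # 0.0 (0x00000000) becomes 2.0 (0x40000000)
```

Flipping exponent bits produces far larger weight excursions than flipping mantissa bits, which is one reason bit-flip studies report very different sensitivities depending on which bit of the word is struck.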