2018 Design, Automation & Test in Europe Conference & Exhibition (DATE)
DOI: 10.23919/date.2018.8342151
Accurate neuron resilience prediction for a flexible reliability management in neural network accelerators

Cited by 54 publications (23 citation statements)
References 13 publications
“…In contrast, since thread-level ABFT generates checksums across small, thread-level matrix multiplications, it can detect one fault per thread in the kernel, resulting in higher overall fault tolerance guarantees. As prior work has illustrated that different layers in NNs are more susceptible to faults than others [52,71], one might also consider varying the approach to ABFT used depending on the fault tolerance requirement of a given layer.…”
Section: Discussion, 7.1 Additional Opportunities for Adaptive ABFT
confidence: 99%
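The checksum construction behind the ABFT scheme discussed above can be sketched in a few lines: append a column-sum row to A and a row-sum column to B, so the product carries its own checksums, which are then verified against the sums of the result block. This is a generic illustration of checksum-based matrix multiplication, not code from any of the cited works; the function names are hypothetical.

```python
import numpy as np

def abft_matmul(A, B):
    """Multiply with ABFT checksums: append a column-sum row to A and a
    row-sum column to B; the product then carries checksum borders."""
    Ac = np.vstack([A, A.sum(axis=0)])                  # extra checksum row
    Br = np.hstack([B, B.sum(axis=1, keepdims=True)])   # extra checksum column
    return Ac @ Br                                      # (m+1) x (n+1) result

def abft_check(Cf, tol=1e-6):
    """Verify the checksummed product: the border row/column must equal the
    column/row sums of the data block, otherwise a fault occurred."""
    C = Cf[:-1, :-1]
    return (np.allclose(Cf[-1, :-1], C.sum(axis=0), atol=tol)
            and np.allclose(Cf[:-1, -1], C.sum(axis=1), atol=tol))
```

A silent corruption of any entry of C breaks both the corresponding row sum and column sum, so a single fault per checked block is detected (and its position narrowed down); thread-level ABFT applies the same construction per thread-level tile rather than per full kernel.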
“…In this work, we make use of the framework presented in [19], where different methods to identify relevant input features in CNNs are implemented. This relevance is defined as the contribution of a given pixel to the overall prediction of the network [18]. We extend this concept to identify network elements where additional approximation can be applied with negligible CNN accuracy degradation.…”
Section: B. Kernel-wise Optimization
confidence: 99%
“…To select these resilient kernels, we propose a novel approach. This solution builds on research into understandable (explainable) neural networks [18], [19], which aims to determine the relationships between neural network predictions and input feature maps.…”
Section: B. Kernel-wise Optimization
confidence: 99%
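The relevance notion the two excerpts above build on can be illustrated with its simplest possible instance: gradient-times-input attribution for a linear scorer, where each feature's contribution to the winning class score is its value times the score's gradient. This is a hypothetical toy sketch, not the framework of [18], [19]:

```python
import numpy as np

def input_relevance(W, x):
    """Gradient-times-input relevance for a linear scorer f(x) = W @ x:
    each input feature's contribution to the predicted class score."""
    scores = W @ x
    c = int(np.argmax(scores))   # predicted class
    grad = W[c]                  # d score_c / d x is exact for a linear model
    return grad * x              # elementwise per-feature contribution
```

Ranking features (or, analogously, kernels) by the magnitude of this relevance and applying approximation to the low-relevance ones first is the intuition behind the kernel-wise optimization described above.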
“…Previous work [26,33] incorporates circuit-level error detection techniques with architectural error bypassing/masking to mitigate the effect of timing and SRAM errors in voltage underscaled DNN accelerators. The irregular resilience behavior of DNN weights and activations is noted in [3,27], and the described characteristic is utilized to improve the resilience of neural network processing systems through reliability-aware resource allocation techniques. An error-correcting output coding scheme is utilized in [19] to enhance the self-correcting capability of neural networks.…”
Section: Related Work
confidence: 99%
“…While DNNs are known to exhibit a noticeable immunity against imprecision and small numerical perturbations, the numerical influence of bit-errors can be dramatic in digital systems, because the bits of standard number representations carry asymmetric weight. The consequent degradation in the accuracy of DNN processing systems has been established in the literature [3, 4, 17, 21, 25-27, 33]; additional reliability measures are therefore called for, particularly in critical applications.…”
Section: Introduction
confidence: 99%
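The asymmetric weight of bits mentioned above is easy to demonstrate: flipping a high-order (sign or exponent) bit of a float32 weight perturbs its value by orders of magnitude more than flipping a mantissa LSB. A minimal sketch using Python's `struct` module (illustrative, not taken from the cited works):

```python
import struct

def flip_bit(value, bit):
    """Flip one bit (0 = mantissa LSB, 31 = sign) of a float32 value."""
    (raw,) = struct.unpack("<I", struct.pack("<f", value))  # reinterpret as uint32
    (out,) = struct.unpack("<f", struct.pack("<I", raw ^ (1 << bit)))
    return out
```

For a weight of 0.5, flipping bit 0 yields roughly 0.50000006, while flipping bit 30 (the exponent MSB) yields about 1.7e38: the same single-bit fault can be harmless or catastrophic depending on its position, which is why uniform protection of all bits is wasteful and position-aware reliability measures pay off.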