ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8683521
Stochastic Data-driven Hardware Resilience to Efficiently Train Inference Models for Stochastic Hardware Implementations

Cited by 21 publications (6 citation statements)
References 12 publications
“…In NI-based training, Gaussian noise N(0, σ 2 NI ) is added to network weights during the feed-forward pass of the back-propagation iterations during training. NI-based training has been observed to improve robustness to line resistance [8], and to device mismatch and conductance variation in [13,19]. We studied the impact of noise injection, by training networks with different values of σ NI , ranging from 0% to 10%.…”
Section: Robustness Improvements in DNN Implementations (confidence: 99%)
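The noise-injection (NI) training described in the excerpt above can be sketched in a few lines. This is a minimal illustration rather than the cited papers' implementation: the function name, the toy linear model, and the choice to scale the noise by each weight's magnitude (so that σ_NI acts as the quoted 0%–10% relative deviation) are assumptions.

```python
import numpy as np

def train_with_noise_injection(X, y, sigma_ni=0.05, lr=0.1, epochs=200, seed=0):
    """Gradient-descent training of a linear model y ~ X @ w, with Gaussian
    noise injected into the weights during each forward pass (NI training).

    sigma_ni is a relative noise level (e.g. 0.0-0.10, matching the range
    studied in the quote); the stored "clean" weights receive the updates.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        # Forward pass uses a perturbed copy of the weights; the noise std
        # is sigma_ni times each weight's magnitude (an illustrative choice).
        w_noisy = w + rng.normal(0.0, sigma_ni, size=w.shape) * np.abs(w)
        residual = X @ w_noisy - y
        grad = X.T @ residual / len(y)   # mean-squared-error gradient
        w -= lr * grad                   # update the clean weights
    return w
```

Training against perturbed weights exposes the optimizer to the same kind of weight deviation the stochastic hardware introduces at inference time, which is the mechanism behind the robustness improvements the excerpt reports.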
“…For example, the achievable inference accuracy of crossbar architectures is limited by IR drop, device non-linearity, thermal noise, process variations, stuck-at faults, write noise, and limited device endurance. A number of works have addressed these challenges, such as [8] (IR drop), [19] (device variations), and [13] (conductance variations). Recently, on-chip training methods have been proposed [16] to minimize the impact of die-specific variations.…”
Section: Introduction (confidence: 99%)
“…The crossbar is assumed to be used for inference in this paper. However, the non-ideal behavior of the device in the inference stage may significantly decrease the application-level accuracy [38], which prevents the use of emerging-device crossbars. In this work, we propose to use a modified training method to alleviate the impact of the non-ideal behavior of the device and circuit, as shown in the right component of Figure 3 ➂.…”
Section: Cross-Layer Exploration Framework (confidence: 99%)
“…machine learning, to enable reduced SNR by incorporating models of the computational noise, in both chip-specific [16]- [18] and chip-generalized [19] training algorithms. This has shown promise, warranting the further research needed to transition such models toward generalized abstractions, suitable for application design and mapping across the range of hardware design parameters and operating conditions.…”
(confidence: 99%)