2018
DOI: 10.48550/arxiv.1811.02187
Preprint

Neural Network-Hardware Co-design for Scalable RRAM-based BNN Accelerators

Yulhwa Kim,
Hyungjun Kim,
Jae-Joon Kim

Abstract: Recently, RRAM-based Binary Neural Network (BNN) hardware has been gaining interest because it requires only 1-bit sense amplifiers and eliminates the need for high-resolution ADCs and DACs. However, RRAM-based BNN hardware still requires high-resolution ADCs for partial-sum calculation when a large-scale neural network is implemented across multiple memory arrays. We propose a neural network-hardware co-design approach that splits the input so that each split network fits on an RRAM array and the reconstructed BNNs calculate 1-bit output neur…
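As a rough illustration of the input-splitting idea summarized in the abstract, the following NumPy sketch (not the authors' code; the 128-row array size, the split_fc_1bit helper, and the per-group thresholds are illustrative assumptions) shows how a large binary fully-connected layer can be split so that each group's partial sum stays inside one array and is reduced to a 1-bit output by a sign/threshold, i.e. a 1-bit sense-amp, instead of a multi-bit ADC.

```python
import numpy as np

# Minimal sketch (not the authors' implementation) of the input-splitting idea:
# a large binary FC layer is split so that each sub-layer fits within one RRAM
# array, and each array produces a 1-bit output via a simple sign/threshold
# (a 1-bit sense-amp) instead of a multi-bit ADC reading the partial sum.

ARRAY_ROWS = 128  # assumed number of word lines per RRAM array


def binarize(x):
    """Map values to {-1, +1}, the usual BNN convention."""
    return np.where(x >= 0, 1, -1)


def split_fc_1bit(x_bin, w_bin, thresholds):
    """Compute a binary FC layer with inputs split across arrays.

    x_bin:      (in_features,) vector in {-1, +1}
    w_bin:      (in_features, out_features) matrix in {-1, +1}
    thresholds: (n_groups, out_features) per-group thresholds (learned in the
                reconstructed network; random values stand in for them here)
    Each input group emits only 1-bit outputs, so no high-resolution ADC is
    needed for the partial sums.
    """
    n_groups = int(np.ceil(len(x_bin) / ARRAY_ROWS))
    bits = []
    for g in range(n_groups):
        lo, hi = g * ARRAY_ROWS, (g + 1) * ARRAY_ROWS
        partial = x_bin[lo:hi] @ w_bin[lo:hi, :]        # stays inside one array
        bits.append(binarize(partial - thresholds[g]))  # 1-bit sense-amp output
    # Downstream layers consume the 1-bit group outputs directly.
    return np.stack(bits)                               # (n_groups, out_features)


# Toy usage with random binary data.
rng = np.random.default_rng(0)
x = binarize(rng.standard_normal(512))
w = binarize(rng.standard_normal((512, 256)))
th = rng.integers(-8, 8, size=(512 // ARRAY_ROWS, 256))
print(split_fc_1bit(x, w, th).shape)  # (4, 256)
```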

Cited by 2 publications (5 citation statements)
References: 10 publications
“…Compared to the STT-BNN design at room temperature, the classification accuracy of the MLP drops only slightly, by up to 0.1% at 85 °C, whereas it drops more significantly, by 3.41%, for the CNN in view of its more complex task. Also, compared to the baseline BNN in [31], the classification accuracy of the MLP drops by up to 0.48% at 85 °C, while it drops by 12% for the CNN. Overall, the effect of mismatch is markedly more pronounced than that of temperature.…”
Section: System-level Validation and Comparison with Prior Art (mentioning)
confidence: 94%
“…Fig. 12 plots the energy per row-wise accumulation and the inference accuracies evaluated from two neural networks built in Torch7 [31] and mapped onto the proposed STT-BNN macro, similarly to [8]. The first and simpler neural network is a multi-layer perceptron (MLP) for handwritten digit classification on the MNIST dataset [32], whose structure is detailed in Table II.…”
Section: Evaluation of Energy Efficiency of STT-BNN (mentioning)
confidence: 99%
“…should be accumulated to complete the matrix computation. SOTA works showed that the precision of the partial sums significantly affects the accuracy of the computed BNNs [2], [10], [11], [16]. To obtain multi-bit partial sums in the SRAM CIM circuits, we need ADCs to produce multi-bit outputs, incurring large area and energy overhead.…”
Section: Software Implementation (mentioning)
confidence: 99%
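To make the point about partial-sum precision concrete, here is a hedged NumPy sketch (the array size, the 2/4/8-bit levels, and the uniform-quantizer model are assumptions for illustration, not taken from the cited works): it quantizes per-array partial sums before accumulation and compares the resulting 1-bit output against the full-precision result, showing how coarse partial sums can flip the binarized output.

```python
import numpy as np

# Hedged illustration of why partial-sum precision matters: a 1024-input binary
# dot product is built from per-array partial sums, and each partial sum is
# quantized by a coarse uniform "ADC" model before accumulation. All sizes and
# bit widths below are assumptions, chosen only to illustrate the effect.

rng = np.random.default_rng(1)
x = np.where(rng.standard_normal(1024) >= 0, 1, -1)
w = np.where(rng.standard_normal(1024) >= 0, 1, -1)


def output_bit(adc_bits=None, rows_per_array=128):
    """1-bit output of the dot product; adc_bits=None means full precision."""
    total = 0.0
    for lo in range(0, len(x), rows_per_array):
        partial = float(x[lo:lo + rows_per_array] @ w[lo:lo + rows_per_array])
        if adc_bits is not None:
            # Uniform quantizer over the partial-sum range [-rows, +rows];
            # fewer ADC bits -> coarser partial sums -> possible output flips.
            step = (2 * rows_per_array) / (2 ** adc_bits)
            partial = round(partial / step) * step
        total += partial
    return 1 if total >= 0 else -1


print("full precision ->", output_bit())
for b in (2, 4, 8):
    print(f"{b}-bit partial sums ->", output_bit(b))
```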
“…To address this problem, the authors of [11] developed an input splitting technique, which we employed in this work. A large convolution or FC layer is reconstructed into several smaller groups, as shown in Fig.…”
Section: Software Implementation (mentioning)
confidence: 99%
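For completeness, a similarly hedged sketch of the group reconstruction described in the statement above (plain NumPy; the grouped_conv1x1 helper and the 128-channel group size are hypothetical, and only a 1x1 convolution is shown, whereas the cited work reconstructs general convolution and FC layers): the input channels are partitioned so that each group's dot products fit in one array and each group emits its own 1-bit feature maps.

```python
import numpy as np

# Hedged sketch of reconstructing a large convolution into smaller groups: the
# input channels are partitioned so each group fits in one array, and each
# group produces binarized feature maps. Sizes are illustrative assumptions.


def grouped_conv1x1(x, w, channels_per_group=128):
    """x: (C_in, H, W) in {-1, +1}; w: (C_in, C_out) in {-1, +1} (1x1 kernel).

    Returns per-group 1-bit feature maps of shape (n_groups, C_out, H, W)."""
    c_in, h, wd = x.shape
    outs = []
    for lo in range(0, c_in, channels_per_group):
        xg = x[lo:lo + channels_per_group].reshape(-1, h * wd)   # (Cg, H*W)
        partial = w[lo:lo + channels_per_group].T @ xg           # (C_out, H*W)
        outs.append(np.where(partial >= 0, 1, -1).reshape(-1, h, wd))
    return np.stack(outs)


# Toy usage: 256 input channels split into 2 groups of 128.
x = np.where(np.random.default_rng(2).standard_normal((256, 8, 8)) >= 0, 1, -1)
w = np.where(np.random.default_rng(3).standard_normal((256, 64)) >= 0, 1, -1)
print(grouped_conv1x1(x, w).shape)  # (2, 64, 8, 8)
```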