2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS)
DOI: 10.1109/mwscas.2017.8052915
Deep learning binary neural network on an FPGA

Cited by 22 publications (16 citation statements). References 4 publications.
“…
Source  Accuracy (%)  Topology
[53]    86.0          ResNet-18
[64]    86.05         9 256-ch conv.
[61]    86.06         5 conv. and 2 FC
[65]    86.78         NK
[35]    86.98         C64-MP-2C128-MP-2C256-2FC512-FC10
[51]    87            NK
[34]    87.16         AlexNet, R1 regularizer
[34]    87.30         AlexNet, R2 regularizer
[38]    87.73         BNN, +1 padding
[32]    88            BNN, 512 channels for FC
[38]    88.42         BNN, 0 padding
[59]    88.47         6 conv. …”
Section: Source, Accuracy (%), Topology, Precision (mentioning)
confidence: 99%
“…CPU-FPGA hybrid systems integrate a CPU and an FPGA on the same silicon. These systems are widely used to implement DNNs and BNNs [35, 38, 39, 41, 54, 61-63, 68, 71]. The CPU is flexible and easily programmed to load inputs into the DNN.…”
Section: Architectures (mentioning)
confidence: 99%
“…A neural network that uses binary weights, hard-threshold activation functions, or any combination of the two is typically known as a BNN. There have been several successful implementations of BNN algorithms in software [26], [27], [28], [29], and attempts to implement them in hardware [30], [31], [32]. The analog hardware implementation of a BNN system with learning remains an open problem [33], [32].…”
Section: Background, A. Learning Algorithms and Biologically Inspired… (mentioning)
confidence: 99%
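To make the definition quoted above concrete, here is a minimal sketch (my own illustration in Python/NumPy, not code from the cited works) of a fully connected layer that combines sign-binarized weights with a hard-threshold activation; all names are hypothetical.

    import numpy as np

    def binary_dense_forward(x, w_real):
        # Binarize real-valued master weights to {+1, -1} with the sign function.
        w_bin = np.where(w_real >= 0, 1, -1).astype(np.int8)
        # Integer accumulate, then hard-threshold activation -> binary outputs.
        pre_act = x @ w_bin.T
        return np.where(pre_act >= 0, 1, -1).astype(np.int8)

    # Hypothetical usage: 4 binary inputs, 3 neurons with real master weights.
    x = np.array([[1, -1, 1, 1]], dtype=np.int8)
    w_real = np.random.randn(3, 4)
    print(binary_dense_forward(x, w_real))  # values in {+1, -1}

Keeping real-valued master weights and binarizing only in the forward pass is the usual training arrangement; a pure-inference deployment would store just the binarized weights.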
“…Section V contains the circuit- and system-level simulation results. Section VI discusses the advantages and limitations of the proposed circuits and introduces the aspects of the design that should be investigated in the future, and Section VII concludes the paper. There is also supplementary material that includes expanded background information, an explicit explanation of the proposed circuit, the device parameters of the main backpropagation circuit proposed in Section III, and simulation results for the learning circuit performance.…”
Section: Introduction (mentioning)
confidence: 99%
“…Notably, early approaches to BNN models in Courbariaux et al. (2015, 2016) and Rastegari et al. (2016) quantize weights or activations into {+1, −1}, which replaces floating-point multiplications with binary bitwise operations, approximating the floating-point multiply-accumulate operation with bitwise XNOR and bit-counting operations. In addition, the quantized binary weights reduce weight-storage requirements, which makes BNNs a highly appealing method for implementing CNNs on embedded systems and programmable devices (Guo, 2018; Zhou et al., 2017b; Yi et al., 2018; Liang et al., 2018). Despite these benefits, the low precision of the binarized operations degrades the classification ability of modern CNNs, limiting their applications.…”
Section: Introduction (mentioning)
confidence: 99%
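As a rough sketch of the bitwise trick described in that passage (my own illustration, not code from the cited papers): if +1 is encoded as bit 1 and −1 as bit 0, the dot product of two length-n {+1, −1} vectors equals 2·popcount(xnor(a, b)) − n, so each floating-point multiply-accumulate collapses into an XNOR and a bit count.

    import numpy as np

    def pack_bits(v):
        # Encode a {+1, -1} vector as an integer bit mask: +1 -> 1, -1 -> 0.
        bits = 0
        for i, x in enumerate(v):
            if x > 0:
                bits |= 1 << i
        return bits

    def xnor_popcount_dot(a_bits, b_bits, n):
        # XNOR marks the positions where the two vectors agree; an agreement
        # contributes +1 to the dot product and a disagreement -1, so
        # dot = agreements - disagreements = 2 * popcount(xnor) - n.
        xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
        return 2 * bin(xnor).count("1") - n

    a = np.array([1, -1, 1, 1, -1])
    b = np.array([1, 1, -1, 1, -1])
    assert xnor_popcount_dot(pack_bits(a), pack_bits(b), len(a)) == a @ b

On an FPGA the same identity maps naturally to wide XNOR gates followed by a popcount tree, which is what makes BNN inference so hardware-friendly.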