2015 49th Annual Conference on Information Sciences and Systems (CISS)
DOI: 10.1109/ciss.2015.7086904
FPGA implementation of a Deep Belief Network architecture for character recognition using stochastic computation

Abstract: Deep Neural Networks (DNNs) have proven very effective for classification and generative tasks, and are widely adopted in a variety of fields including vision, robotics, speech processing, and more. Specifically, Deep Belief Networks (DBNs), graphical models constructed of multiple layers of nodes connected as Markov random fields, have been successfully applied to such tasks. However, because of the numerous connections between nodes in the network, DBNs suffer the drawback of being computati…
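The title's "stochastic computation" refers to representing values as random bitstreams so that arithmetic reduces to simple logic gates. As a minimal illustrative sketch (not taken from the paper), the core trick is that ANDing two independent Bernoulli bitstreams multiplies the probabilities they encode, which is why dense DBN weight-activation products become cheap in hardware:

```python
import numpy as np

# Illustrative sketch of stochastic computing, assuming the standard
# unipolar encoding: a value p in [0, 1] becomes a random bitstream
# whose fraction of 1s is p. Multiplication then needs only a single
# AND gate per bit instead of a full multiplier.
rng = np.random.default_rng(0)

N = 100_000            # stream length; accuracy improves roughly as 1/sqrt(N)
a, b = 0.8, 0.6        # two values to multiply, both in [0, 1]

sa = rng.random(N) < a   # Bernoulli bitstream encoding a
sb = rng.random(N) < b   # independent bitstream encoding b

product = np.mean(sa & sb)   # the AND gate acts as a multiplier: E[sa & sb] = a*b
```

The estimate converges to a·b = 0.48 as the stream lengthens; the trade-off the paper's abstract alludes to is exactly this one, circuit area versus bitstream length (latency and precision).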

Cited by 39 publications (10 citation statements)
References 19 publications
“…12, the ISImax value for N = 64 is 0.5 of the ISImax value for N = 32, and so on. The experimental result follows the trend in (13) that N · ISImax is a constant.…”
Section: A. Accumulation of Hazards (supporting)
confidence: 73%
“…This stochastic spiking network can also be trained on-line by using an event-driven approach of the training method for the RBM [11]. A spiking RBM has been mapped on the neuromorphic TrueNorth system with digital spiking neurons and by using a noisy threshold model to implement the Gibbs sampler [12] and a spiking DBN was implemented on the FPGA using inputs encoded as unary streams [13].…”
Section: Introduction (mentioning)
confidence: 99%
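The Gibbs sampler mentioned in the snippet above is the stochastic core these hardware systems approximate: each RBM layer is sampled from a sigmoid-Bernoulli conditional given the other layer. A minimal numpy sketch of one block-Gibbs step, with illustrative dimensions and randomly initialized weights (not values from the paper):

```python
import numpy as np

# Hypothetical sketch of one block-Gibbs step in a binary RBM, the
# sampling procedure that spiking/stochastic hardware implementations
# emulate. Layer sizes and weights here are placeholders.
rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 784, 500                    # e.g. an MNIST-sized RBM
W = rng.normal(0.0, 0.01, (n_visible, n_hidden))  # weight matrix
b_v = np.zeros(n_visible)                         # visible biases
b_h = np.zeros(n_hidden)                          # hidden biases

v = (rng.random(n_visible) < 0.5).astype(float)   # random visible state

p_h = sigmoid(v @ W + b_h)                        # P(h_j = 1 | v)
h = (rng.random(n_hidden) < p_h).astype(float)    # sample hidden layer

p_v = sigmoid(h @ W.T + b_v)                      # P(v_i = 1 | h)
v_new = (rng.random(n_visible) < p_v).astype(float)  # sample visible layer
```

In the cited hardware variants, the Bernoulli draw `rng.random(...) < p` is what gets replaced by a noisy neuron threshold or a unary input stream, while the conditional-probability structure stays the same.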
“…In recent years, different hardware architectures have been presented for the character recognition task. In [66], Sanni et al. presented a hardware implementation of a deep belief network architecture for character recognition using stochastic computation. The authors evaluated their architecture on a Kintex-7 FPGA device for the MNIST database of handwritten digits [67].…”
Section: Related Work (mentioning)
confidence: 99%
“…It has the ability of self-learning and self-adaptation. Neural networks include feed-forward neural networks [5], BP neural networks [6], deep belief networks, and convolutional neural networks [7]. Among them, the CNN not only has the strong fault tolerance, self-learning, and adaptive ability of traditional neural networks but also offers automatic feature extraction, local connections, and weight sharing.…”
Section: Hand-written Word Recognition Research Status Quo (mentioning)
confidence: 99%