2012 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas.2012.6271917
A dual-mode weight storage analog neural network platform for on-chip applications

Abstract: On-chip trainable neural networks show great promise in enabling various desired features of modern integrated circuits (IC), such as Built-In Self-Test (BIST), security and trust monitoring, self-healing, etc. Cost-efficient implementation of these features imposes strict area and power constraints on the circuits dedicated to neural networks, which, however, should not compromise their ability to learn fast and retain functionality throughout their lifecycle. To this end, we have designed and fabricated a re…

Cited by 8 publications (4 citation statements). References 9 publications.
“…Other gradient descent-based optimization methods have also been implemented on neuromorphic systems for training, and they tend to be variations of back-propagation that have been adapted or simplified in some way [639], [645], [709], [716], [718], [719], [723], [792], [812], [844], [1030], [1122], [1300]- [1303]. Back-propagation methods have also been developed in chip-in-the-loop training methods [686], [702], [732], [815], [859]; in this case, most of the learning takes place on a host machine or off-chip, but the evaluation of the solution network is done on the chip. These methods can help to take into account some of the device's characteristics, such as component variation.…”
Section: A. Supervised Learning
confidence: 99%
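The chip-in-the-loop scheme described above — weight updates computed on a host machine while every network evaluation runs through the hardware, so device variation is folded into training — can be illustrated with a minimal sketch. Everything here is a hypothetical stand-in: `chip_forward` simulates the on-chip evaluation (with an invented per-weight gain mismatch standing in for component variation), and the host uses finite-difference gradients measured through that "chip":

```python
import numpy as np

rng = np.random.default_rng(0)

def chip_forward(weights, x):
    """Stand-in for on-chip network evaluation. A real chip-in-the-loop setup
    would drive the hardware with x and the candidate weights and read back
    the output; here a fixed per-weight gain models component variation."""
    gain = 1.0 + 0.05 * np.sin(np.arange(weights.size)).reshape(weights.shape)
    return np.tanh(x @ (weights * gain))

# Synthetic task: the host only ever sees chip measurements, never the gains.
X = rng.normal(size=(32, 4))
y = np.tanh(X @ rng.normal(scale=0.5, size=(4, 1)))  # target outputs
W = rng.normal(scale=0.1, size=(4, 1))               # host-side weights

eps, lr = 1e-4, 0.3
base0 = np.mean((chip_forward(W, X) - y) ** 2)       # initial loss
for _ in range(500):
    # Finite-difference gradient of the loss, measured through the chip,
    # so the update automatically compensates for the gain mismatch.
    base = np.mean((chip_forward(W, X) - y) ** 2)
    grad = np.zeros_like(W)
    for i in np.ndindex(*W.shape):
        Wp = W.copy()
        Wp[i] += eps
        grad[i] = (np.mean((chip_forward(Wp, X) - y) ** 2) - base) / eps
    W -= lr * grad

final = np.mean((chip_forward(W, X) - y) ** 2)
print(f"loss: {base0:.3f} -> {final:.5f}")
```

Because the loss is always measured on the (simulated) device, the learned weights absorb the device's idiosyncrasies — the property the quoted passage attributes to chip-in-the-loop methods.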
“…Artificial neural network (ANN)-based machine learning, known as a promising technology, has been researched widely to make electronic devices more intelligent and efficient [1, 2, 3, 4]. ANN is inspired by the brain of living creatures, and contains components such as neurons, connections, weights, and a propagation function.…”
Section: Introduction
confidence: 99%
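The components the passage lists — neurons, connections, weights, and a propagation function — map directly onto a few lines of code. A minimal sketch of forward propagation through a fully connected network (layer sizes and weight values are illustrative, not from the paper):

```python
import numpy as np

def propagate(x, layers):
    """Propagation function: each layer holds connection weights W and a
    bias b; each neuron applies a tanh activation to its weighted input."""
    for W, b in layers:
        x = np.tanh(W @ x + b)
    return x

# A 2-3-1 network: 2 inputs, one hidden layer of 3 neurons, 1 output.
layers = [
    (np.array([[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]), np.zeros(3)),
    (np.array([[1.0, -1.0, 0.5]]), np.zeros(1)),
]
out = propagate(np.array([1.0, -1.0]), layers)
print(out)
```

Training such a network then amounts to adjusting the weight matrices, which is what the on-chip and chip-in-the-loop schemes discussed in this page's citation statements implement in hardware.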
“…The key contribution of this paper is the development of a custom neural platform targeting the specific design requirements outlined above. The design represents a major revision of an earlier version [9] with improved circuit- and system-level characteristics. The architecture supports two popular learning models, namely the multilayer perceptron (MLP) and the ontogenic neural network (ONN).…”
Section: Introduction
confidence: 99%