2015 IEEE International Electron Devices Meeting (IEDM)
DOI: 10.1109/iedm.2015.7409718

Scaling-up resistive synaptic arrays for neuro-inspired architecture: Challenges and prospect

Abstract: The crossbar array architecture with resistive synaptic devices is attractive for on-chip implementation of weighted sum and weight update in the neuro-inspired learning algorithms. This paper discusses the design challenges on scaling up the array size due to non-ideal device properties and array parasitics. Circuit-level mitigation strategies have been proposed to minimize the learning accuracy loss in a large array. This paper also discusses the peripheral circuits design considerations for the neuro-inspir…
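The two operations the abstract names, the weighted sum (read) and the weight update (write), can be pictured with a minimal numerical sketch. The array size, conductance window, voltages, and clipped-increment update below are illustrative assumptions, not values from the paper; the read models an ideal crossbar obeying Ohm's and Kirchhoff's laws, with none of the parasitics and device non-idealities the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical array dimensions and device conductance window (siemens).
N_ROWS, N_COLS = 64, 32
G_MIN, G_MAX = 1e-6, 1e-4
G = rng.uniform(G_MIN, G_MAX, (N_ROWS, N_COLS))   # synaptic conductances

def weighted_sum(G, v_in):
    """Read: each column current is I_j = sum_i G[i, j] * V[i]
    (Ohm's law per cell, Kirchhoff's current law per column)."""
    return v_in @ G                                # shape (N_COLS,), amperes

def weight_update(G, delta_G):
    """Write: nudge each cell's conductance, clipped to the device window
    (real devices are also nonlinear, asymmetric, and stochastic)."""
    return np.clip(G + delta_G, G_MIN, G_MAX)

v = rng.uniform(0.0, 0.2, N_ROWS)                  # row read voltages
currents = weighted_sum(G, v)                      # one parallel weighted sum
G = weight_update(G, 1e-7 * rng.standard_normal(G.shape))
```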

Cited by 167 publications (125 citation statements: 5 supporting, 120 mentioning, 0 contrasting)
References 9 publications

Citation statements (ordered by relevance):
“…Then, specialization proceeds with one post-neuron specializing for only one character. The success of the learning session demonstrates the robustness of the network against device-to-device variability, in accordance with Yu et al (2015), provided analog behavior holds in each device. Figure 8B shows the weight distribution of the synaptic matrix during training.…”
Section: Discussion (supporting)
confidence: 86%
“…Some works deal with networks utilizing analog resistance transition in only one direction, either in depression (Yu et al, 2013b) or in potentiation (Eryilmaz et al, 2014). Only few works use analog synapses to simulate neuromorphic networks, as an example Querlioz et al (2013), Yu et al (2015), and Serb et al (2016). The latter, in particular, proposes a network realized in part with real hardware analog memristors and in part with software simulation.…”
Section: Introduction (mentioning)
confidence: 99%
“…In fact, device runtime stochasticity may be considered a favourable property that mimics real biological synapses and can act as a regularizer during training [26]. In addition, practical network operations do not require years of data retention, as in the case of storage systems, and requirement of device endurance may also be relaxed, since weight updates are often infrequent [27]. Mathematically, neuromorphic computations can be decomposed into a series of vector-matrix multiplication operations that are naturally implemented using the memristor crossbar structure.…”
Section: Nature Electronics (mentioning)
confidence: 99%
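Since device conductances are strictly positive, the signed weights of the vector-matrix multiplications mentioned in the quote above are often mapped onto two arrays (a common differential-pair scheme). The sketch below is my own illustration of that mapping, not a circuit from the cited works; the conductance window and scaling factor are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

W = 0.1 * rng.standard_normal((128, 10))      # signed software weights
G_MIN, G_MAX = 1e-6, 1e-4                     # assumed conductance window (S)
g_scale = (G_MAX - G_MIN) / np.abs(W).max()   # weight-to-conductance scale

# Differential pair: G_pos - G_neg = g_scale * W, both arrays positive.
G_pos = G_MIN + g_scale * np.maximum(W, 0.0)
G_neg = G_MIN + g_scale * np.maximum(-W, 0.0)

x = rng.random(128)                           # layer input (read voltages)
y = (x @ G_pos - x @ G_neg) / g_scale         # two reads recover x @ W
assert np.allclose(y, x @ W)                  # the common-mode G_MIN cancels
```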
“…We already know that matrix-vector multiplication (or weighted sum) can be easily achieved on RRAM crossbar arrays [20], thus, these arrays can be used as synaptic weights of neuromorphic systems. Now we find that the same crossbar architecture can compute squared Euclidean distance as well, which is the most crucial variable in kNN algorithm.…”
Section: Results (mentioning)
confidence: 99%
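The squared-distance claim in the quote above follows from the expansion ||x − w_j||² = x·x − 2 x·w_j + w_j·w_j: the cross term is exactly the dot product a crossbar read delivers, and the remaining terms are a query-dependent scalar and precomputable pattern norms. The sketch below is my own software illustration of that algebra, not the cited paper's circuit; the pattern count and dimensionality are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

W = rng.random((100, 16))               # 100 stored patterns (crossbar columns)
x = rng.random(16)                      # query vector

dots = W @ x                            # one crossbar read: all x . w_j at once
w_norms = np.einsum('ij,ij->i', W, W)   # w_j . w_j, precomputable offline
d2 = x @ x - 2.0 * dots + w_norms       # squared Euclidean distances

k = 5
nearest = np.argsort(d2)[:k]            # indices of the k nearest patterns
```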
“…In other words, neuromorphic systems always require massive computing to determine their synaptic weights, which makes them still dependent on von Neumann architecture, or results in extra hardware overhead. In addition, device variations may significantly reduce the recognition accuracy of neuromorphic systems [20].…”
mentioning
confidence: 99%