2019
DOI: 10.1063/1.5129729

Spike time dependent plasticity (STDP) enabled learning in spiking neural networks using domain wall based synapses and neurons

Abstract: We have implemented a Spiking Neural Network (SNN) architecture using a combination of spin-orbit-torque-driven domain wall devices and transistor-based peripheral circuits as both synapses and neurons. Learning in the SNN hardware is achieved under both a completely unsupervised mode and a partially supervised mode through mechanisms, incorporated in our spintronic synapses and neurons, that have biological plausibility, e.g., Spike Time Dependent Plasticity (STDP) and homeostasis. High classification accuracy i…
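The learning mechanism named in the abstract, pair-based STDP, adjusts a synaptic weight from the relative timing of pre- and post-synaptic spikes: a pre-before-post pairing potentiates the synapse, post-before-pre depresses it. A minimal sketch of the standard pair-based rule follows; the parameter values (`a_plus`, `a_minus`, `tau_plus`, `tau_minus`) are illustrative placeholders, not figures taken from this paper.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update.

    w            : current synaptic weight
    t_pre, t_post: spike times (ms) of pre- and post-synaptic neurons
    Returns the updated weight, clipped to [w_min, w_max].
    """
    dt = t_post - t_pre
    if dt >= 0:
        # Causal pairing (pre before post): potentiation (LTP)
        dw = a_plus * math.exp(-dt / tau_plus)
    else:
        # Anti-causal pairing (post before pre): depression (LTD)
        dw = -a_minus * math.exp(dt / tau_minus)
    return min(w_max, max(w_min, w + dw))
```

In the paper's hardware, an update of this shape would be realized physically by domain wall motion rather than computed in software; the sketch only captures the timing dependence the abstract refers to.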

Cited by 18 publications (21 citation statements)
References 39 publications
“…However, the partially supervised nature comes from applying bias currents to the output-stage neurons (post-neurons) only during the learning phase, controlling them so that they fire selectively (Fig. 1(b)) [20], [24], [41], [42]. While SNNs have been trained on simple datasets such as Fisher's Iris dataset and the Wisconsin Breast Cancer dataset under the partially supervised mode in previous reports [20], [24], [41], [42], training on a more complex dataset with more input features and more output classes, such as MNIST, has not, to the best of our knowledge, been demonstrated before, whether with spintronic hardware or otherwise.…”
Section: Work Done In This Paper
confidence: 99%
“…3. Earlier works [24], [43] report only the energy consumed in the spintronic devices themselves for STDP-enabled learning. In this paper, we have reported the net energy consumed in the peripheral transistor circuits as well.…”
Section: Work Done In This Paper
confidence: 99%