2019
DOI: 10.3390/ma12172745
Neuromorphic Spiking Neural Networks and Their Memristor-CMOS Hardware Implementations

Abstract: Inspired by biology, neuromorphic systems have been trying to emulate the human brain for decades, taking advantage of its massive parallelism and sparse information coding. Recently, several large-scale hardware projects have demonstrated the outstanding capabilities of this paradigm for applications related to sensory information processing. These systems allow for the implementation of massive neural networks with millions of neurons and billions of synapses. However, the realization of learning strategies …


Cited by 88 publications (66 citation statements)
References 184 publications (275 reference statements)
“…The following processors (Table 4) have the most mature developer workflows, combined with the widest availability of standalone systems. More details are given in [229], [230].…”
Section: Neuromorphic Computing
confidence: 99%
“…Each layer needs time to wait to perform the computation until the output of the previous layer is processed; this sequential process can lead to a recognition delay in the neural networks. SNNs have the advantage that they can process data in continuous time by firing spikes that show certain spatiotemporal correlations ( Camuñas-Mesa et al., 2019 ); this is an advantage over sequential processing in DNNs.…”
Section: Neural Network
confidence: 99%
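The statement above contrasts the layer-by-layer, clocked processing of DNNs with the event-driven, continuous-time operation of SNNs. A minimal sketch (an illustration, not code from the cited paper) of the spiking behavior it refers to is a leaky integrate-and-fire (LIF) neuron, which integrates input over time and emits a spike the moment its membrane potential crosses a threshold:

```python
# Hypothetical leaky integrate-and-fire (LIF) neuron sketch.
# Parameter names (leak, threshold) are illustrative assumptions.
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Return the spike train produced by a stream of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for i in inputs:
        v = leak * v + i          # leaky integration of the input current
        if v >= threshold:        # threshold crossing -> emit a spike
            spikes.append(1)
            v = 0.0               # reset membrane potential after firing
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.5, 0.5, 0.5, 0.0, 0.9, 0.9]))  # → [0, 0, 1, 0, 0, 1]
```

The spike timing itself carries information here: a strong, sustained input drives the neuron to threshold sooner, which is the spatiotemporal coding the statement describes.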
“…One main issue is the lack of efficient training algorithms and well-established datasets. For DNNs, the availability of numerous datasets has enabled training of a complex deep learning process based on backpropagation learning algorithms ( Camuñas-Mesa et al., 2019 ). However, the training algorithms for DNNs are not directly used for SNNs ( Tang et al., 2019 ).…”
Section: Neural Network
confidence: 99%
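The obstacle named above is that backpropagation needs a differentiable activation, while a spike is a hard threshold with zero gradient almost everywhere. One common workaround in the SNN literature (a hedged sketch, not the method of the cited paper) is a surrogate gradient: keep the step function in the forward pass but substitute a smooth derivative in the backward pass.

```python
# Illustrative surrogate-gradient sketch; function names and the choice of
# a sigmoid-shaped surrogate are assumptions, not from the cited paper.
import math

def spike(v, threshold=1.0):
    """Forward pass: Heaviside step, non-differentiable at the threshold."""
    return 1.0 if v >= threshold else 0.0

def surrogate_grad(v, threshold=1.0, beta=5.0):
    """Backward pass: derivative of a steep sigmoid used as a smooth
    stand-in for the step function's gradient."""
    s = 1.0 / (1.0 + math.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

# The surrogate peaks at the threshold and decays away from it,
# so gradients flow mainly through neurons close to firing.
print(surrogate_grad(1.0))  # → 1.25 (beta * 0.25 at the threshold)
```

This decoupling of forward and backward passes is what lets gradient-based training be adapted to spiking networks despite the discontinuity.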
“…The choice of neural network on chip for this work is SNN, regarded as the third-generation neural network, due to its biological feats and efficiency in the spatial–temporal signal coding [ 28 ]. In SNN, a web is weaved by the interconnections of neurons and synapses to perform inference and training tasks.…”
Section: Architecture and Design
confidence: 99%