Editorial: Understanding and Bridging the Gap Between Neuromorphic Computing and Machine Learning
2021
DOI: 10.3389/fncom.2021.665662

Cited by 14 publications (10 citation statements)
References 22 publications
“…The artificial synapses must have nearly zero nonlinearity and asymmetry in conductance modulation for neuromorphic applications. There are various types of ML models of neural networks used in neuromorphic computing, including ANNs, deep neural networks, and convolutional neural networks. These models mimic the strengthening and weakening of the connection between pre- and postsynaptic neurons by the sum of synaptic weights.…”
Section: Results
confidence: 99%
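A minimal sketch, in Python, of the weighted-sum synapse behavior and the linear conductance-update ideal described in the excerpt above; the function names, the nonlinearity parameter, and the update rule are illustrative assumptions, not taken from the cited work.

import numpy as np

def postsynaptic_input(pre_activity, weights):
    # The postsynaptic drive is the sum of presynaptic activity scaled by
    # synaptic weights, mirroring the strengthening and weakening of
    # connections that the excerpt describes.
    return np.dot(weights, pre_activity)

def potentiate(conductance, g_max, n_pulses, nonlinearity=0.0):
    # Hypothetical conductance-update rule: with nonlinearity = 0 every pulse
    # adds the same increment (the near-ideal, linear case the excerpt asks
    # for); larger values make later pulses less effective, so the stored
    # weight no longer matches the intended update.
    for _ in range(n_pulses):
        step = (g_max - nonlinearity * conductance) / 100.0
        conductance = min(g_max, conductance + step)
    return conductance

Only when the increment is pulse-independent does the device-level weight track the learning rule, which is why near-zero nonlinearity and asymmetry are required.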
“…Through this direct method, ANN-converted SNNs typically need thousands of time steps to represent the information encoded in the spike train for completing a single inference, in addition to the drop in accuracy caused by imposing constraints on the source ANN (Rathi et al. 2020; Lee et al. 2020a). This is quite a large breakdown from the other SNNs, leading to significant latency and energy consumption opposite to the original purpose (Singh et al. 2020; Deng, Tang, and Roy 2021; Lee et al. 2020b; Zhang et al. 2020).…”
Section: Background and Related Work
confidence: 99%
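A rough illustration of the latency issue raised above: under simple rate coding, an ANN-converted SNN approximates each activation by a firing rate, so the representation error only shrinks as the number of time steps grows. This is a toy sketch (integrate-and-fire neuron, constant input), not the conversion scheme of any particular cited paper.

import numpy as np

def rate_coded_estimate(activation, T, threshold=1.0):
    # Integrate-and-fire neuron driven by a constant input equal to the
    # ANN activation; its firing rate over T steps approximates that value.
    membrane, spikes = 0.0, 0
    for _ in range(T):
        membrane += activation
        if membrane >= threshold:
            spikes += 1
            membrane -= threshold
    return spikes / T  # approximation error shrinks roughly as 1/T

# With T = 10 the estimate of 0.37 is off by about 0.07; with T = 1000 the
# error essentially vanishes, which is why long spike trains are needed.
print(rate_coded_estimate(0.37, 10), rate_coded_estimate(0.37, 1000))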
“…Deep neural networks have made tremendous progress and become a prevalent tool for performing various cognitive tasks such as object recognition (Simonyan and Zisserman, 2015; Sandler et al., 2018), natural language processing (Devlin et al., 2018; Radford et al., 2019), and self-driving (Nedevschi et al., 2012; Liu et al., 2017). Leveraging the capability of deep neural networks in ubiquitous environments requires deployment not only on large-scale computers but also on portable edge devices (Deng et al., 2021). However, the increasing complexity of deep neural networks, coupled with the flood of data from distributed sensors that continuously generate real-time content, places tremendous energy demands on current computing platforms.…”
Section: Introduction
confidence: 99%
“…To date, shallow SNN structures (i.e., two fully connected layers) have been widely used for classification. However, training high-performance SNNs with competitive classification accuracy and low latency is a nontrivial problem, limiting their scalability in complex applications (Benjamin et al., 2014; Roy et al., 2019; Sengupta et al., 2019; Comsa et al., 2020; Deng et al., 2021).…”
Section: Introduction
confidence: 99%
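For context, a minimal sketch of the kind of shallow SNN the excerpt mentions: two fully connected layers of leaky integrate-and-fire neurons driven by rate-coded inputs. The layer sizes, random untrained weights, and encoding are illustrative assumptions only.

import numpy as np

def lif_layer(spikes_in, weights, membrane, threshold=1.0, leak=0.9):
    # Leaky integrate-and-fire layer: integrate weighted input spikes,
    # emit a spike wherever the membrane crosses threshold, then reset.
    membrane = leak * membrane + weights @ spikes_in
    spikes_out = (membrane >= threshold).astype(float)
    membrane[spikes_out == 1] = 0.0
    return spikes_out, membrane

# Shallow SNN: 784 inputs -> 100 hidden -> 10 outputs, run for T time steps
# and classified by the output neuron that fires most often.
rng = np.random.default_rng(0)
w1, w2 = rng.normal(0, 0.1, (100, 784)), rng.normal(0, 0.1, (10, 100))
m1, m2, counts = np.zeros(100), np.zeros(10), np.zeros(10)
x = rng.random(784)  # stand-in for a normalized input image
for _ in range(T := 50):
    s0 = (rng.random(784) < x).astype(float)  # Poisson-like rate coding
    s1, m1 = lif_layer(s0, w1, m1)
    s2, m2 = lif_layer(s1, w2, m2)
    counts += s2
print("predicted class:", int(np.argmax(counts)))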