Design, Automation & Test in Europe Conference & Exhibition (DATE), 2017
DOI: 10.23919/date.2017.7926981
Approximate computing for spiking neural networks

Cited by 47 publications (31 citation statements); references 22 publications.
“…What is interesting about weight quantization is that the designer can trade off the accuracy of the SNN application against the energy and area requirements of the neural network. Approximate computing can also be achieved at the neuron level, where insignificant units are deactivated to reduce the computation cost of evaluating SNNs [163].…”
Section: Exploit Fault Tolerance of Neural Networks
Confidence: 99%
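The neuron-level idea above — skipping "insignificant" units during evaluation — can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's exact scheme: the significance scores, the fixed firing threshold of 1.0, and the function name `approximate_snn_step` are all illustrative.

```python
import numpy as np

def approximate_snn_step(weights, spikes, potentials, significance, threshold=0.05):
    """One hypothetical SNN update step that skips insignificant neurons.

    Neurons whose significance score (e.g., a recent firing-rate estimate)
    falls below `threshold` are deactivated: their membrane potentials are
    not updated, so the corresponding synaptic accumulation work is saved.
    The significance metric and threshold value are assumptions.
    """
    active = significance >= threshold            # mask of neurons worth evaluating
    # Accumulate synaptic input only for the active subset.
    potentials[active] += weights[active] @ spikes
    fired = potentials >= 1.0                     # fixed firing threshold (assumed)
    potentials[fired] = 0.0                       # reset neurons that fired
    return fired

# Tiny deterministic demo: the second neuron is marked insignificant.
w = np.ones((2, 2))
pot = np.array([0.0, 0.5])
sig = np.array([1.0, 0.0])
fired = approximate_snn_step(w, np.ones(2), pot, sig, threshold=0.5)
# pot[1] is untouched: the skipped neuron costs no synaptic work this step
```

The saving comes from the masked matrix-vector product: only rows of `weights` belonging to active neurons participate in the accumulation.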
“…What is interesting with weight quantization is that the designer can realize a trade-off between the accuracy of the SNN application against energy and area requirements of the neural networks. Approximate computing can also be achieved at the neuron level, where insignificant units are deactivated to reduce computation cost of evaluating SNNs [163].…”
Section: Exploit Fault Tolerance Of Neural Networkmentioning
confidence: 99%
“…Studies have revealed that the computing accuracy of synapses and neurons influences the final results of computations. Therefore, SNNs can be employed to solve problems for which only approximate answers are required, thereby conserving resources [31]. Scholars have also discovered that the computing costs of neural networks can be reduced using compression and sparse methods [32].…”
Section: Current Development of Neurocomputing Technology
Confidence: 99%
“…Approximate computing is well known in the area of signal processing and neural network hardware, but has seen limited application to spiking networks. One example is [15], where neurons are progressively trimmed from evaluation as time progresses. Again, our approach is orthogonal to this, and could be used to further reduce computations even for those neurons that are being evaluated.…”
Section: Approximate and Emerging Technologies
Confidence: 99%
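The progressive-trimming idea quoted above can be illustrated with a small sketch. This is an assumed heuristic, not the cited paper's exact algorithm: the margin-based trimming rule, the early exit, and the name `classify_with_trimming` are all illustrative.

```python
import numpy as np

def classify_with_trimming(weights, spike_trains, trim_margin=2.0):
    """Hypothetical time-stepped SNN readout with progressive neuron trimming.

    At each timestep, output neurons whose membrane potential lags the
    current leader by more than `trim_margin` are dropped from further
    evaluation, in the spirit of trimming neurons as time progresses.
    `weights` has shape (n_outputs, n_inputs); `spike_trains` is a sequence
    of input spike vectors, one per timestep.
    """
    n_out = weights.shape[0]
    potentials = np.zeros(n_out)
    alive = np.ones(n_out, dtype=bool)
    for spikes in spike_trains:                   # iterate over timesteps
        # Only still-alive neurons accumulate synaptic input.
        potentials[alive] += weights[alive] @ spikes
        leader = potentials[alive].max()
        # Trim candidates too far behind the leader (assumed heuristic);
        # the leader itself always survives, so `alive` never empties.
        alive &= potentials >= leader - trim_margin
        if alive.sum() == 1:
            break                                 # early exit: one candidate left
    return int(np.argmax(potentials))
```

Trimmed neurons stop paying for synaptic accumulation on later timesteps, which is where the computational saving over evaluating every output neuron for the full duration comes from.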