2016
DOI: 10.1073/pnas.1604850113
Convolutional networks for fast, energy-efficient neuromorphic computing

Abstract: Deep networks are now able to achieve human-level performance on a broad spectrum of recognition tasks. Independently, neuromorphic computing has now demonstrated unprecedented energy-efficiency through a new chip architecture based on spiking neurons, low precision synapses, and a scalable communication network. Here, we demonstrate that neuromorphic computing, despite its novel architectural primitives, can implement deep convolution networks that (i) approach state-of-the-art classification accuracy across …

Cited by 674 publications (601 citation statements)
References 35 publications
“…Driving goals are to carry both the astounding energy efficiency of computations in neural networks of the brain and their learning capability into future generations of electronic hardware. A realization of this dream has now come one step closer, as reported by Esser et al (1). The authors demonstrate that a very energy-efficient implementation of an artificial neural network (i.e., of a circuit that shares properties with networks of neurons in the brain) achieves almost the same performance as humans, as shown on eight benchmark datasets for recognizing images and sounds.…”
mentioning, confidence: 63%
“…A learning algorithm like that in Esser et al (1) cannot do that. However, many online learning methods have been developed in machine learning.…”
mentioning, confidence: 95%
“…We call this approach "train-then-constrain", comparing it to the "constrain-then-train" used in [13]. Using the "linear reset" mode, TrueNorth neurons can be configured to produce spiking rates similar to ReLUs.…”
Section: Introduction; mentioning, confidence: 99%
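The correspondence between "linear reset" spiking neurons and ReLUs mentioned in the last citation statement can be illustrated with a minimal rate simulation. This is a generic integrate-and-fire sketch with subtractive reset, not the actual TrueNorth neuron configuration; the function name, threshold, and step count are illustrative assumptions:

```python
def linear_reset_rate(input_current, threshold=1.0, n_steps=1000):
    """Simulate an integrate-and-fire neuron with a 'linear'
    (subtractive) reset: on spiking, the threshold is subtracted
    from the membrane potential instead of resetting it to zero,
    so no accumulated charge is discarded.
    Returns the average spike rate over n_steps."""
    v = 0.0
    spikes = 0
    for _ in range(n_steps):
        v += input_current          # integrate constant input
        if v >= threshold:
            v -= threshold          # subtractive ("linear") reset
            spikes += 1
    return spikes / n_steps

# The steady-state rate tracks ReLU: zero for non-positive input,
# linear above zero (saturating at one spike per time step).
rates = [linear_reset_rate(x) for x in (-0.5, 0.0, 0.25, 0.5)]
```

Because the subtractive reset conserves the residual membrane potential, a constant input x with threshold 1 yields a rate of max(0, x) up to saturation, which is exactly the ReLU transfer function in rate-coded terms.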