2017 IEEE International Symposium on Circuits and Systems (ISCAS)
DOI: 10.1109/iscas.2017.8050530

Pattern representation and recognition with accelerated analog neuromorphic systems

Abstract: Despite being originally inspired by the central nervous system, artificial neural networks have diverged from their biological archetypes as they have been remodeled to fit particular tasks. In this paper, we review several possibilities to reverse-map these architectures to biologically more realistic spiking networks with the aim of emulating them on fast, low-power neuromorphic hardware. Since many of these devices employ analog components, which cannot be perfectly controlled, finding ways to compensate for…

Cited by 14 publications (10 citation statements); references 8 publications.
“…In smaller networks, synaptic weight resolution is a critical performance modifier (Petrovici et al, 2017b). However, the penalty imposed by a limited synaptic weight resolution is known to decrease for larger deep networks with more and larger hidden layers, both spiking and non-spiking (Courbariaux et al, 2015; Petrovici et al, 2017a). Furthermore, the successor system (BrainScaleS-2, Aamir et al, 2016) is designed with a 6-bit weight resolution.…”
Section: Discussion
confidence: 99%
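The citation above concerns the penalty imposed by limited synaptic weight resolution, noting that BrainScaleS-2 is designed with 6-bit weights. A minimal sketch of what such a constraint means numerically is shown below; the uniform quantization scheme and the weight matrix are illustrative assumptions, not the hardware's actual weight mapping.

```python
# Illustrative sketch (assumed uniform quantization, not the BrainScaleS-2
# weight-mapping procedure): discretizing synaptic weights to a fixed
# number of bits and measuring the worst-case rounding error.
import numpy as np

def quantize_weights(w, bits=6):
    """Uniformly quantize weights to 2**bits levels spanning their range."""
    levels = 2 ** bits - 1
    w_min, w_max = w.min(), w.max()
    step = (w_max - w_min) / levels
    return np.round((w - w_min) / step) * step + w_min

rng = np.random.default_rng(0)
w = rng.normal(0.0, 1.0, size=(100, 100))   # hypothetical trained weights

# Worst-case error is bounded by half a quantization step, so it shrinks
# geometrically with each additional bit of resolution.
err6 = np.abs(w - quantize_weights(w, bits=6)).max()
err3 = np.abs(w - quantize_weights(w, bits=3)).max()
assert err3 > err6
```

This captures only the per-weight rounding error; the cited works' point is that the resulting *network-level* accuracy penalty shrinks further as networks grow deeper and wider.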
“…Previous small-scale studies of sampling on accelerated mixed-signal neuromorphic hardware include (Petrovici et al, 2015, 2017a,b). An implementation of sampling with spiking neurons and its application to the MNIST dataset was shown in Pedroni et al (2016) using the fully digital, real-time TrueNorth neuromorphic chip (Merolla et al, 2014).…”
Section: Discussion
confidence: 99%
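The citation above refers to sampling with spiking neurons on neuromorphic hardware. The abstract computation such implementations approximate is sampling from a Boltzmann distribution; a minimal Gibbs-sampling sketch is given below. The coupling matrix, biases, and update rule are illustrative assumptions and not taken from any of the cited implementations.

```python
# Illustrative sketch (assumed parameters): Gibbs sampling from a small
# Boltzmann distribution p(z) ∝ exp(z·W·z/2 + b·z), the target that
# neural-sampling networks of spiking neurons approximate.
import numpy as np

def gibbs_sample(W, b, steps=5000, rng=None):
    """Sequentially resample each binary unit from its conditional."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(b)
    z = rng.integers(0, 2, size=n)
    samples = np.empty((steps, n), dtype=int)
    for t in range(steps):
        for k in range(n):
            u = W[k] @ z + b[k]                 # membrane-potential analogue
            p_on = 1.0 / (1.0 + np.exp(-u))     # logistic activation
            z[k] = rng.random() < p_on
        samples[t] = z
    return samples

W = np.array([[0.0, 1.5], [1.5, 0.0]])          # symmetric excitatory coupling
b = np.array([-0.5, -0.5])
s = gibbs_sample(W, b)
# With positive coupling, the two units switch on together more often
# than independent marginals would predict.
```

On accelerated analog hardware, each such update is carried out by a physical neuron's stochastic spiking dynamics rather than by an explicit loop, which is the source of the speed and energy advantages discussed in the statement above.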
“…In contrast to the precise but resource hungry frame-based ANN approaches, our network is based on a spike code and therefore potentially very energy efficient if implemented on a neuromorphic hardware platform. Differences in timescale between inputs and hardware can be a problem for neuromorphic systems if they shall operate on natural stimuli in real time and the timescale of a neuromorphic system is often a design choice depending on the potential application (see for instance Qiao et al, 2015 ; Petrovici et al, 2017 for a real time and faster than real time analog neuromorphic hardware framework). Due to the fully event-based nature of our architecture, our network circumvents this problem and is able to operate on any time scales with computations being only driven by external input.…”
Section: Discussion and Outlook
confidence: 99%
“…Most of these use a spiking, pre-trained approach, i.e., the networks are trained in either ANN or SNN fashion, then in the case of ANNs, converted to SNNs, and implemented on the spiking neuromorphic hardware. Examples include TrueNorth (Esser et al, 2015 , 2016 ), SpiNNaker 1 (Jin et al, 2010 ), the BrainScaleS system (Petrovici et al, 2017 ; Schmitt et al, 2017 ), or the Zurich subthreshold systems (Indiveri et al, 2015 ). Of these, only the last one incorporates some learning, i.e., the last layer of the deep SNN is subject to online supervised learning, with the other layers having pretrained fixed weights.…”
Section: Discussion
confidence: 99%