2015 IEEE Biomedical Circuits and Systems Conference (BioCAS) 2015
DOI: 10.1109/biocas.2015.7348416
Device mismatch in a neuromorphic system implements random features for regression

Abstract: We use a large-scale analog neuromorphic system to encode the hidden-layer activations of a single-layer feedforward network with random weights. The random activations of the network are implemented using the device mismatch inherent to analog circuits. We show that these activations, produced by analog VLSI implementations of integrate-and-fire neurons, are suited to solving multidimensional, nonlinear regression tasks. Exploitation of the device mismatch eliminates the storage requirements for the r…

Cited by 13 publications (7 citation statements) | References 16 publications
“…Examples of theoretical neural processing frameworks that require variability can be found in the domains of ensemble learning [53], reservoir computing [54], and liquid state machines [55]. Current efforts in neuromorphic engineering for implementing such frameworks to solve spatio-temporal pattern recognition problems rely on the variability provided by transistor device-mismatch effects [56]–[60]. Integration of memristive devices with inhomogeneous properties in such architectures can provide a richer set of distributions useful for enhancing the computational abilities of these networks.…”
mentioning
confidence: 99%
“…3. The spike-times of each of the lateral, delayed projections were independently drawn from a uniform random distribution ranging from 1–50 ms before the direct forward excitation from A to B2, which defines the reference time (t = 0) of the pattern. All spike-patterns for which a given B2 neuron generated one or more postsynaptic spikes in response were accumulated to approximate the receptive field of that neuron.…”
Section: Mapping Of Receptive Fields
mentioning
confidence: 99%
“…Learning and recognition of spatiotemporal patterns — in contrast to static spatial patterns, or even sequences of such — is a central conceptual problem in neuromorphic and event-based processing [11], and has been addressed, for instance, with SNNs that incorporate neural signal-propagation delays [11]–[15]. Some current approaches to spatiotemporal pattern recognition with inhomogeneous neuromorphic hardware use neural processing frameworks that actually rely on variability in the processing elements [13], [16]–[21], such as reservoir computing, liquid state machines, and ensemble learning. Another relevant concept is that of the Spiking Time-Difference Encoder (sTDE) [22], which provides a general neurocomputational primitive for temporal encoding but requires specialized hardware for its implementation.…”
Section: Introduction
mentioning
confidence: 99%
“…In fact, the original neuromorphic definition by Carver Mead referred to analog circuits that operated in subthreshold mode [1]. There is a large variety of other neuromorphic analog implementations [7], [8], [12], [14], [17]–[19], [99], [100], [323], [324], [326]–[329], [331]–[333], [335]–[372], [374]–[387], [516], [571], [572], [578]–[580], [594]–[603], [606]–[610], [612]–[614], [620]–[640], [642]–[646], [648]–[653], [655]–[664], [666]–[683], [685]–[699], [701]–[703], [705]–[719], [722]– …”
Section: A High-level
mentioning
confidence: 99%