2017
DOI: 10.1038/nnano.2017.83
Sparse coding with memristor networks

Abstract: Sparse representation of information provides a powerful means to perform feature extraction on high-dimensional data and is of broad interest for applications in signal processing, computer vision, object recognition and neurobiology. Sparse coding is also believed to be a key mechanism by which biological neural systems can efficiently process a large amount of complex sensory data while consuming very little power. Here, we report the experimental implementation of sparse coding algorithms in a bio-inspired…
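The sparse coding idea the abstract describes can be sketched in software. Below is a minimal iterative soft-thresholding (ISTA) example that recovers a sparse code for a signal over a random dictionary; the algorithm choice, dictionary, and parameters are generic illustrative assumptions, not the paper's memristor implementation.

```python
import numpy as np

# Hedged sketch: ISTA for sparse coding,
#   min_x 0.5 * ||y - D x||^2 + lam * ||x||_1
# The dictionary D and signal y below are illustrative placeholders.
rng = np.random.default_rng(0)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms
x_true = np.zeros(32)
x_true[[3, 17, 25]] = [1.0, -0.5, 0.8]    # 3-sparse ground truth
y = D @ x_true                            # observed signal

def ista(D, y, lam=0.05, n_iter=500):
    L = np.linalg.norm(D, 2) ** 2         # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - (D.T @ (D @ x - y)) / L   # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(D, y)
# x_hat is sparse: most entries are driven exactly to zero by the threshold.
```

The soft-threshold step is what enforces sparsity; in the paper's hardware this role is played by lateral neuron inhibition rather than an explicit threshold function.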

Cited by 549 publications
(427 citation statements)
References 28 publications
“…This type of behaviour has been recently observed in so-called second-order memristor devices 12,67 and diffusive memristor devices 68, where the rise and decay of one state variable (for example, local temperature) encodes the relative timing information and can subsequently modulate the…”
Section: The Role of Chemistry and Biological Details
confidence: 76%
“…For example, memristor hardware performing pattern classification has been demonstrated, initially using a 2 × 10 crossbar 28 and later expanded to a 12 × 12 crossbar 30. A generic dot-product engine using memristor arrays for neuromorphic applications was introduced in 2016 18, and a sparse coding chip that allows lateral neuron inhibition was developed using a 32 × 32 crossbar 31, followed by the demonstration of principal component analysis through online learning in a 9 × 2 crossbar 32. Large-scale neural networks have also been demonstrated using phase-change memory, following the same principle 33.…”
Section: Nature Electronics
confidence: 99%
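The dot-product engine mentioned in the excerpt above exploits Ohm's and Kirchhoff's laws: voltages applied along the crossbar rows and currents summed on the columns compute a vector-matrix product in a single analogue step. A minimal numerical sketch, with conductance and voltage values that are purely illustrative rather than taken from the cited devices:

```python
import numpy as np

# Hedged sketch of a crossbar dot-product engine. Each crosspoint holds a
# device conductance G[i, j] (siemens); applying row voltages V (volts) and
# reading column currents I implements I = G^T V via Kirchhoff's current law.
G = np.array([[1.0e-6, 2.0e-6],
              [0.5e-6, 1.5e-6],
              [2.0e-6, 0.5e-6]])   # 3 rows x 2 columns (illustrative values)
V = np.array([0.2, 0.1, 0.3])     # voltages on the 3 rows

I = G.T @ V                        # column currents = analogue VMM result
```

In hardware the multiply-accumulate happens in the physics of the array, which is why the operation count scales with the array size at essentially constant latency.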
“…We performed a simulation of the power consumption for the image compression task with our experimental parameters, including conductance measurements after programming, dissipation by the wire resistances and writing the input patterns, and found the power consumed in the 128 × 64 crossbar array was ~13.7 mW, or an efficiency of ~119.7 effective tera-operations per watt. As an approximate comparison, a highly optimized digital system with an application-specific integrated circuit (ASIC) fabricated at the 40 nm technology node for 4-bit 100-dimensional vector and 4-bit 100 × 200 matrix multiplication, for which the accuracy is comparable with our solution, has a reported energy efficiency of 7.02 × 10¹² operations per second per watt 29.…”
Section: Nature Electronics
confidence: 99%
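As a back-of-envelope check on the figures quoted above, the stated power and efficiency together imply the sustained operation rate of the array (a consistency check on the excerpt's numbers, not a new measurement):

```python
# Quoted figures from the excerpt above: ~13.7 mW consumed at an efficiency
# of ~119.7 effective tera-operations per watt.
power_w = 13.7e-3          # watts
tops_per_w = 119.7e12      # operations per second per watt

# efficiency = rate / power, so rate = efficiency * power
ops_per_s = tops_per_w * power_w   # ~1.64e12 operations per second
```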
“…Recently, pulse width, instead of amplitude, was used to represent the analogue input signals [27][28][29][30] , but this scheme requires more readout time and more complicated integrated circuits. Previous experimental demonstrations of an analogue-voltage-amplitude-vector by analogue-conductance-matrix product, to the best of our knowledge, have been limited to a 1 × 3 system [24][25][26] , which is not strictly a VMM implementation.…”
confidence: 99%
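The encoding trade-off described above, pulse width versus voltage amplitude, can be illustrated by the charge delivered through a single device: both schemes deliver a charge proportional to the input value, but pulse-width coding trades readout time for simpler analogue voltage generation. All device values below are placeholders, not parameters from the cited work.

```python
import numpy as np

# Hedged sketch contrasting the two input-encoding schemes mentioned above.
g = 1.0e-6                 # device conductance (siemens), illustrative
x = 0.6                    # normalized input value in [0, 1]
v_max, t_read = 0.3, 1.0e-6

# Amplitude coding: analogue voltage V = x * V_max applied for a fixed time.
q_amp = g * (x * v_max) * t_read

# Pulse-width coding: fixed voltage V_max applied for a duration t = x * t_read.
q_pw = g * v_max * (x * t_read)

# Both deliver the same charge q = g * x * V_max * t_read; the pulse-width
# scheme needs a variable-duration pulse, hence the longer readout time the
# excerpt notes.
assert np.isclose(q_amp, q_pw)
```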
“…[15][16][17][18][19][20][21][22] However, it is important to stress that the machines we propose need feedback to operate as we desire. 5 Therefore, memelements alone are not enough.…”
confidence: 99%