Published: 2021 | DOI: 10.3389/fnins.2021.580909
Non-linear Memristive Synaptic Dynamics for Efficient Unsupervised Learning in Spiking Neural Networks

Abstract: Spiking neural networks (SNNs) are a computational tool in which information is coded into spikes, as in some parts of the brain, unlike conventional neural networks (NNs) that compute over real numbers. SNNs can therefore implement intelligent information extraction in real time at the edge of data acquisition, and they represent a complementary solution to conventional NNs used for cloud computing. Both NN classes face hardware constraints due to limited computing parallelism and separation o…

Cited by 27 publications (33 citation statements) · References 64 publications
“…They have been proposed to implement electronic synapses in hardware neural networks, due to the ability to adapt their strength (conductance) in an analogue fashion as a function of incoming electrical pulses (synaptic plasticity), leading to long-term (short-term) potentiation and depression. In addition, learning rules such as spike-time- or spike-rate-dependent plasticity, paired-pulse facilitation or voltage-threshold-based plasticity have been demonstrated; the stochasticity of the switching process has been exploited for stochastic update rules [67–69]. Most VCM devices are based on a two-terminal configuration, and the switching geometry involves either confined filamentary or interfacial regions (figure 5(A)).…”
Section: Status (mentioning)
confidence: 99%
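To make the plasticity rules mentioned above concrete, the following is a minimal sketch (not taken from the cited works) of a pair-based STDP update applied to a memristive conductance; the parameter names and values are illustrative assumptions, not measurements of any specific device.

```python
import numpy as np

# Illustrative pair-based STDP update for a memristive synapse (a sketch,
# not the rule of any specific cited device). All values are assumptions.
A_PLUS, A_MINUS = 5e-6, 5e-6          # potentiation / depression amplitudes (S)
TAU_PLUS, TAU_MINUS = 20e-3, 20e-3    # plasticity time constants (s)
G_MIN, G_MAX = 1e-6, 1e-4             # conductance bounds of the device (S)

def stdp_update(g, dt):
    """Return the conductance after one pre/post spike pair.

    dt = t_post - t_pre: a positive dt (pre before post) potentiates,
    a negative dt depresses; the result stays within the device range.
    """
    if dt >= 0:
        g = g + A_PLUS * np.exp(-dt / TAU_PLUS)    # long-term potentiation
    else:
        g = g - A_MINUS * np.exp(dt / TAU_MINUS)   # long-term depression
    return float(np.clip(g, G_MIN, G_MAX))

# Example: a causal pair 5 ms apart slightly increases the conductance.
print(stdp_update(5e-5, 5e-3))
```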
“…VCM devices have been developed over the last 15 years mainly for storage applications, but for neuromorphic applications the required properties differ. In general, desirable properties of memories for neural networks include (i) analogue behaviour or controllable multilevel states, (ii) compatibility with learning rules that also support online learning, and (iii) tuneable short-term and long-term stability of the weights to implement various dynamics and timescales in synaptic and neuronal circuits [67–69]. A significant debate still concerns the linear/non-linear and symmetric/asymmetric conductance update of experimental devices, the synaptic resolution (number of resistance levels), and how to exploit or mitigate these features (figures 5(B) and (C)).…”
Section: Current and Future Challenges (mentioning)
confidence: 99%
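As an illustration of the non-linear conductance-update behaviour under debate, the following is a sketch of a widely used phenomenological update model in which the step size shrinks as the conductance approaches its bound, with independent non-linearity factors for potentiation and depression. It is a toy model under assumed parameters, not the behaviour of any specific device in the cited works.

```python
import numpy as np

# Phenomenological model of non-linear, asymmetric conductance updates
# (a sketch; window, level count and non-linearity factors are assumptions).
G_MIN, G_MAX = 1e-6, 1e-4        # device conductance window (S)
N_LEVELS = 64                    # nominal number of programming pulses

def potentiate(g, alpha_p=2.0):
    """One potentiation pulse; alpha_p controls how quickly steps saturate."""
    x = (g - G_MIN) / (G_MAX - G_MIN)            # normalised conductance
    step = (G_MAX - G_MIN) / N_LEVELS
    return min(G_MAX, g + step * np.exp(-alpha_p * x))

def depress(g, alpha_d=4.0):
    """One depression pulse; a larger alpha_d makes depression more abrupt."""
    x = (G_MAX - g) / (G_MAX - G_MIN)
    step = (G_MAX - G_MIN) / N_LEVELS
    return max(G_MIN, g - step * np.exp(-alpha_d * x))

# Example: identical pulses produce smaller and smaller conductance changes.
g = G_MIN
for _ in range(5):
    g = potentiate(g)
    print(f"{g:.3e}")
```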
“…Weight change in standard STDP models generally depends not only on the relative timing of pre- and post-synaptic spikes but also on the present weight value of the target synapse [9, 13, 31–36]. This originates from the fact that the dynamic range of a synaptic weight is not unlimited but has upper and lower bounds.…”
Section: Introduction (mentioning)
confidence: 99%
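A common way to express this weight dependence is a soft-bound rule (a generic form, not necessarily the exact rule of the cited works), in which the update amplitude scales with the distance to the nearest bound:

$$
\Delta w_{+} = \eta_{+}\,(w_{\max}-w)\,e^{-\Delta t/\tau_{+}},
\qquad
\Delta w_{-} = -\,\eta_{-}\,(w-w_{\min})\,e^{\Delta t/\tau_{-}},
$$

with \(\Delta t = t_{\text{post}} - t_{\text{pre}}\), so potentiation steps vanish as \(w \to w_{\max}\) and depression steps vanish as \(w \to w_{\min}\), keeping the weight inside its finite dynamic range.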
“…Accordingly, the update asymmetry of devices is more likely to affect the process of training neural networks (Kwon et al., 2020). Inspired by this motivation, these works demonstrated that asymmetric non-linear devices are more powerful than symmetric linear devices (Brivio et al., 2018, 2021; Kim et al., 2021). Our analytical tools and calibration method can be applied to illustrate in detail the relationship between the update asymmetry of devices and its impact on training.…”
Section: Impact of Update Asymmetries on Neural Network Performance (mentioning)
confidence: 99%
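As a rough illustration of how such update asymmetry can be quantified (one of several possible definitions, and not the calibration method of the cited work), one can compare the potentiation curve with the time-reversed depression curve obtained from identical pulse trains:

```python
import numpy as np

# Simple asymmetry measure between potentiation and depression curves
# (illustrative definition; the curves below are synthetic stand-ins for
# measured conductances after i identical potentiation / depression pulses).
pulses = np.arange(64)
g_up = 1 - np.exp(-pulses / 15)      # saturating potentiation (normalised)
g_dn = np.exp(-pulses / 40)          # differently shaped depression

# For perfectly symmetric updates, the depression curve traversed backwards
# would coincide with the potentiation curve; the mean mismatch is then zero.
asymmetry = np.mean(np.abs(g_up - g_dn[::-1]))
print(f"mean up/down mismatch: {asymmetry:.3f}")
```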