2017
DOI: 10.1016/j.neucom.2017.01.088

A spiking network that learns to extract spike signatures from speech signals

Abstract: Spiking neural networks (SNNs) with adaptive synapses reflect core properties of biological neural networks. Speech recognition, as an application involving audio coding and dynamic learning, provides a good test problem to study SNN functionality. We present a simple, novel, and efficient nonrecurrent SNN that learns to convert a speech signal into a spike train signature. The signature is distinguishable from signatures for other speech signals representing different words, thereby enabling digit recognition…

Cited by 48 publications (23 citation statements)
References 32 publications

“…In this approach, firing the target neuron causes STDP over the incoming synapses and firing the non-target neurons causes anti-STDP. This approach has successfully been used in SNNs for numerical data classification [120], handwritten digit recognition [121], spoken digit classification [83], and reinforcement learning in SNNs [122]. The sharp synaptic weight adaptation based on immediate STDP and anti-STDP results in fast learning.…”
Section: B. Learning Rules in SNNs
confidence: 99%
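
As a rough illustration of the quoted rule, here is a minimal Python sketch of an STDP / anti-STDP update triggered by an output neuron's label. All names and constants are hypothetical and not taken from the cited works [83], [120]–[122]:

```python
import numpy as np

def supervised_stdp_step(w, pre_trace, fired, is_target, lr=0.01):
    """One update of the incoming weights of an output neuron.

    w         : incoming synaptic weights, shape (n_pre,)
    pre_trace : decaying presynaptic activity traces, shape (n_pre,)
    fired     : True if this output neuron just spiked
    is_target : True if this neuron carries the correct label

    A firing target neuron applies STDP (potentiates recently active
    inputs); a firing non-target neuron applies anti-STDP, i.e. the
    same update with flipped sign.
    """
    if not fired:
        return w
    sign = 1.0 if is_target else -1.0
    return w + sign * lr * pre_trace

# Toy usage: two output neurons, same input traces, opposite labels.
trace = np.array([0.9, 0.1, 0.5])
w = np.full(3, 0.5)
print(supervised_stdp_step(w, trace, fired=True, is_target=True))   # potentiated
print(supervised_stdp_step(w, trace, fired=True, is_target=False))  # depressed
```

The sharp, immediate weight change on each labeled spike is what the quoted passage credits for fast learning.
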
“…For a system matrix A of size N×N in a discrete state-space system which defines the time dynamics, we get its diagonal entries in vector a. From this, we propose the memory metric τ_M for an approximate model of the reservoir (13) to be (22). In our case, the discrete time step h is 1 ms.…”
Section: B. Concept of Memory
confidence: 99%
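
The quoted passage does not reproduce its equation (22), so the following is only a plausible reconstruction: assuming each stable diagonal entry a_i of A gives a mode that decays as a_i^k, its time constant is τ_i = −h / ln|a_i|, and one natural aggregate is the slowest mode. The function name and aggregation choice are assumptions, not the paper's definition:

```python
import numpy as np

def memory_metric(A, h=1e-3):
    """Hypothetical reconstruction of a memory metric tau_M.

    For a discrete state-space system x[k+1] = A @ x[k] with step h,
    a mode with diagonal entry a_i (0 < |a_i| < 1) decays like
    a_i**k, i.e. with time constant tau_i = -h / ln(|a_i|).
    Taking the slowest mode is one plausible aggregate; the paper's
    own definition (its Eq. (22)) may differ.
    """
    a = np.abs(np.diag(A))
    a = a[(a > 0.0) & (a < 1.0)]   # keep stable, decaying modes
    tau = -h / np.log(a)           # per-mode time constants in seconds
    return tau.max()

# Toy usage with h = 1 ms, as in the quoted passage.
A = np.diag(np.linspace(0.80, 0.99, 100))
print(memory_metric(A, h=1e-3))   # ~0.0995 s for the slowest mode
```
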
“…Performance of the LSM for various synaptic weight scalings is found (Fig. 4(b)); the memory metric τ_M of the corresponding state-space-modelled system is found using (22), and its variation against synaptic weight scaling is shown in Fig. 4(c).…”
Section: B. Performance as a Function of Network Activity
confidence: 99%
“…These models present single-layer supervised learning. However, the idea of spike/no-spike classification opens a new supervised learning category incorporating efficient spike-timing-dependent plasticity (STDP) [28,29,30] and anti-STDP, triggered according to the neuron's label [31,32]. STDP is a biologically plausible learning rule, observed in the brain, in which presynaptic spikes occurring immediately before a postsynaptic spike strengthen the interconnecting synapses (LTP); otherwise, the synapses are weakened (LTD).…”
Section: Introduction
confidence: 99%
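
To make the LTP/LTD timing dependence above concrete, here is a minimal sketch of the standard pairwise exponential STDP window; the magnitudes and time constant are illustrative, not values from the cited papers:

```python
import numpy as np

# Illustrative constants; the cited works use their own values.
A_PLUS, A_MINUS = 1.0, 0.5   # LTP / LTD magnitudes
TAU = 20.0                   # window time constant (ms)

def stdp_window(dt):
    """Weight change for one pre/post spike pair, dt = t_post - t_pre (ms).

    dt > 0 : presynaptic spike just before postsynaptic -> LTP.
    dt < 0 : presynaptic spike after postsynaptic -> LTD.
    """
    if dt == 0:
        return 0.0
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU)
    return -A_MINUS * np.exp(dt / TAU)

for dt in (-40, -10, 10, 40):
    print(f"dt = {dt:+d} ms -> dw = {stdp_window(dt):+.3f}")
```

An anti-STDP update simply negates this window, which is how the non-target neurons in the supervised scheme quoted earlier are depressed.
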