2020
DOI: 10.1038/s41598-020-70136-5
SpiFoG: an efficient supervised learning algorithm for the network of spiking neurons

Abstract: are mapped into spike times to yield 16 features (4×L). In the dataset, each class has 50 instances. In order to allow a fair comparison with the state-of-the-art algorithms, we have used the same number of training and testing samples as those algorithms. Therefore, 50% of the 150 instances are used for training and the rest are used to test the performance. The WBC dataset 42 represents two classes: Benign (Class 1) and Malignant (Class 2). There are 699 instances with 16 missing va…
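To make the encoding and split described in the abstract concrete, here is a minimal sketch (not the authors' code) of a latency-based population encoding that turns 4 real-valued features, presumably the Iris dataset with 150 instances, into 4×L = 16 spike-time features and reproduces the 50/50 train/test split. The neuron count L, the time window T_MAX, and the linear distance-to-latency mapping are assumptions; SpiFoG's actual encoding may differ.

```python
# Hypothetical sketch of feature-to-spike-time encoding and the 50/50 split.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

L = 4          # assumed number of encoding neurons per feature (4 x L = 16)
T_MAX = 10.0   # assumed latest allowed spike time (ms)

def encode_spike_times(x, x_min, x_max, n_neurons=L, t_max=T_MAX):
    """Map one feature vector to spike times via a simple population latency code."""
    centers = np.linspace(x_min, x_max, n_neurons, axis=1)   # (4, L) preferred values
    width = (x_max - x_min) / (n_neurons - 1)                # (4,) spacing per feature
    # Distance to the preferred value sets latency: closer value -> earlier spike.
    dist = np.abs(x[:, None] - centers) / width[:, None]     # (4, L)
    return np.clip(dist, 0.0, 1.0).ravel() * t_max           # 16 spike times

X, y = load_iris(return_X_y=True)                            # 150 instances, 50 per class
x_min, x_max = X.min(axis=0), X.max(axis=0)
spikes = np.array([encode_spike_times(x, x_min, x_max) for x in X])

# 50% of the 150 instances for training, the rest for testing, as stated above.
X_tr, X_te, y_tr, y_te = train_test_split(spikes, y, train_size=0.5,
                                          stratify=y, random_state=0)
print(X_tr.shape, X_te.shape)   # (75, 16) (75, 16)
```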

Cited by 10 publications (5 citation statements)
References 40 publications
“…The authors in 38 present a supervised method of SNN training, which they refer to as SpiFoG. SpiFoG uses evolutionary optimization to train synapses by introducing random synaptic delays.…”
Section: Related Work and Problem Definition
confidence: 99%
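The evolutionary training summarized in this excerpt can be illustrated with a short sketch. The following is an assumption-laden outline of a generic genetic algorithm over synaptic weights and random delays, not the SpiFoG implementation; the population size, truncation selection, Gaussian mutation, and the classification_accuracy fitness stub are all hypothetical.

```python
# Hypothetical sketch: evolving synaptic weights together with random delays.
import numpy as np

rng = np.random.default_rng(0)

N_SYNAPSES = 16 * 3          # assumed: 16 inputs fully connected to 3 output neurons
POP_SIZE, GENERATIONS = 40, 200
MAX_DELAY = 5.0              # assumed maximum synaptic delay (ms)

def classification_accuracy(weights, delays):
    """Placeholder fitness: run the SNN on the training set and return accuracy."""
    return rng.random()      # replace with an actual simulation of the network

# Each individual carries a weight vector and a delay vector.
pop = [(rng.normal(0.0, 1.0, N_SYNAPSES),
        rng.uniform(0.0, MAX_DELAY, N_SYNAPSES)) for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    fitness = np.array([classification_accuracy(w, d) for w, d in pop])
    # Keep the better half of the population as parents (truncation selection).
    parents = [pop[i] for i in np.argsort(fitness)[-POP_SIZE // 2:]]
    children = []
    for w, d in parents:
        # Mutate weights and delays with small Gaussian noise to create offspring.
        children.append((w + rng.normal(0.0, 0.1, N_SYNAPSES),
                         np.clip(d + rng.normal(0.0, 0.1, N_SYNAPSES),
                                 0.0, MAX_DELAY)))
    pop = parents + children
```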
“…The complicated dynamics of biological spiking neurons have posed great difficulty in designing efficient SNNs capable of solving complex information processing problems 72 . Early models relied on Spike Time Dependent Plasticity (STDP), which depends on pre-synaptic and post-synaptic information 73 , 74 and results in non-differentiable models 73 , 74 . Surrogate gradients and multiple learning schemes have been proposed to tackle this challenge.…”
Section: Brain-inspired Computing Models
confidence: 99%
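As a reference for why STDP-based models are described above as non-differentiable, here is a sketch of the standard pair-based STDP update, in which the weight change is an exponential function of the pre/post spike-time difference rather than the gradient of a smooth loss. The constants are illustrative and not taken from the cited works.

```python
# Sketch of the pair-based STDP rule; constants are illustrative assumptions.
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012      # assumed potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # assumed time constants (ms)

def stdp_delta_w(t_pre, t_post):
    """Weight change for a single pre/post spike pair."""
    dt = t_post - t_pre
    if dt >= 0:   # pre fires before post -> long-term potentiation
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    else:         # post fires before pre -> long-term depression
        return -A_MINUS * np.exp(dt / TAU_MINUS)

print(stdp_delta_w(10.0, 15.0))   # potentiation
print(stdp_delta_w(15.0, 10.0))   # depression
```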
“…In addition, research on computational neural networks showed that, for classification problems, SNNs use fewer neurons than second-generation artificial neural networks (ANNs) [ 22 ]. Moreover, the hardware implementation of SNNs demonstrated their efficacy in modelling conditional reflex formation [ 23 ] or in controlling the contraction of artificial muscles composed of shape memory alloy (SMA).…”
Section: Introduction
confidence: 99%