Proceedings of the International Conference on Neuromorphic Systems 2019
DOI: 10.1145/3354265.3354279
Feature Extraction using Spiking Convolutional Neural Networks

Abstract: Spiking neural networks are biologically plausible counterparts of artificial neural networks. Artificial neural networks are usually trained with stochastic gradient descent, whereas spiking neural networks are trained with spike-timing-dependent plasticity. Training deep convolutional neural networks is a memory- and power-intensive job, and spiking networks could potentially help reduce the power usage. There is a large pool of tools from which one can choose to train artificial neural networks of any size; on the ot…

Cited by 19 publications (11 citation statements); references 34 publications.
“…To monitor the weight updates (synapse changes) in the spiking network, the software provides the capability to monitor spike activity, weight evolution (updates), feature extraction (spikes per map per label), synapse convergence, etc. This software tool was used here and in [34] [35]. Similar to our work, Mozafari et al. released the software tool SpykeTorch [43], which is based on the PyTorch [44] deep learning tool.…”
Section: Software Tool
confidence: 99%
“…Such behavior is expected as the Conv2 layer tries to learn features that are more complex than the features in layer L2 (Conv1). For a Conv2 layer trained with spikes collected without lateral inhibition in layer L3 (Pool 1), an early stopping mechanism based on the temporal differences can be used (see [35]); an example plot is shown in Figure 24. The net4.spike_statistics() method was used to generate the plots shown in Figure 21 and Figure 22.…”
Section: B Conv2 and Pool 2 Without Lateral Inhibition in Pool
confidence: 99%
“…Deep learning has been used in a plethora of applications such as autonomous driving, cancer prediction, and low-power object recognition [2] [3] [4]. In particular, neural networks as a regression tool have been used in applications such as time series learning [5], stock prediction [6], pose estimation in computer vision [7], and cost predictions [8].…”
Section: Introduction
confidence: 99%
“…Typically, Spiking Neural Networks (SNNs) are trained using an unsupervised algorithm called Spike Timing Dependent Plasticity (STDP) [5]. Spike features extracted from latency-encoded convolutional variants of SNNs have been used with an SVM [5] and a linear neural network classifier [14] to achieve classification accuracies in excess of 98.5%. However, SNNs tend to achieve lower classification accuracies when compared to Artificial Neural Networks (ANNs) [11].…”
Section: Introduction
confidence: 99%
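The STDP rule referenced in the quote above can be illustrated with a minimal pair-based sketch. This is an assumption-laden toy, not the rule used in the cited papers: the exponential window, the constants `A_PLUS`, `A_MINUS`, and `TAU`, and the function name `stdp_dw` are all hypothetical choices for illustration.

```python
import math

# Hypothetical constants for a pair-based STDP window (not from the paper).
A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes
TAU = 20.0                      # time constant in ms

def stdp_dw(t_pre, t_post):
    """Weight change for one pre/post spike pair (times in ms).

    If the presynaptic spike precedes the postsynaptic spike, the
    synapse is potentiated; otherwise it is depressed. The magnitude
    decays exponentially with the spike-time difference.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU)    # pre before post: strengthen
    return -A_MINUS * math.exp(dt / TAU)       # post before pre: weaken

# Pre spike at 10 ms, post spike at 15 ms -> positive (potentiating) update.
print(stdp_dw(10.0, 15.0) > 0)  # True
```

Because the update depends only on relative spike timing, no gradient or label signal is needed, which is why STDP is described as unsupervised.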
“…In our previous work [14] we used the MNIST dataset split into two disjoint tasks to show that features extracted from a spiking convolutional network (SCN) demonstrated more immunity to catastrophic forgetting compared to their ANN counterparts. In [14], using early stopping, the first five output neurons were trained to classify the digits {0, 1, 2, 3, 4} and then the remaining five output neurons were trained to classify the digits {5, 6, 7, 8, 9}. The network was then tested on the complete test dataset (digits 0-9) and achieved a 93% accuracy on this test data.…”
Section: Introduction
confidence: 99%
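The disjoint-task split described in the quote above (digits {0–4} first, then {5–9}) can be sketched as a simple index partition. This is a hedged illustration only: the helper name `split_tasks` and the plain-list label representation are assumptions, and actual dataset loading is elided.

```python
# Sketch of splitting MNIST labels into two disjoint tasks, as in the
# catastrophic-forgetting experiment described above. `split_tasks` is
# a hypothetical helper; labels are assumed to be plain integers.
def split_tasks(labels, task_a=frozenset({0, 1, 2, 3, 4}),
                task_b=frozenset({5, 6, 7, 8, 9})):
    """Return index lists for the two disjoint digit tasks."""
    idx_a = [i for i, y in enumerate(labels) if y in task_a]
    idx_b = [i for i, y in enumerate(labels) if y in task_b]
    return idx_a, idx_b

labels = [0, 5, 3, 9, 4, 7]
a, b = split_tasks(labels)
print(a, b)  # [0, 2, 4] [1, 3, 5]
```

Training the first five output neurons on task A and the remaining five on task B, then evaluating on the full 0–9 test set, is what yields the 93% figure reported in the quote.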