2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489673
Unsupervised Learning with Self-Organizing Spiking Neural Networks

Abstract: We present a system that hybridizes self-organizing map (SOM) properties with spiking neural networks (SNNs), retaining many of the features of SOMs. Networks are trained in an unsupervised manner to learn a self-organized lattice of filters via excitatory-inhibitory interactions among populations of neurons. We develop and test several inhibition strategies, such as inhibition that grows with inter-neuron distance and two distinct levels of inhibition. The quality of the unsupervised learning algorithm is eva…
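The abstract's "inhibition that grows with inter-neuron distance" can be sketched as a weight matrix over a 2-D lattice. The function name, constants, and linear distance scaling below are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical sketch: inhibitory weights on a side x side lattice that
# grow (in magnitude) with Euclidean inter-neuron distance. The linear
# scaling and constant c_inh are assumptions for illustration only.
import numpy as np

def lattice_inhibition(side, c_inh=1.0):
    """Return an (n, n) matrix of inhibitory weights for a side x side
    lattice (n = side**2); farther neuron pairs inhibit each other more
    strongly, and self-inhibition is zero."""
    coords = np.array([(i, j) for i in range(side) for j in range(side)],
                      dtype=float)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)      # pairwise lattice distances
    w = -c_inh * dist                         # more negative = stronger inhibition
    np.fill_diagonal(w, 0.0)                  # no self-inhibition
    return w

W = lattice_inhibition(3)                     # 3x3 lattice -> 9x9 weight matrix
```

With such a matrix, nearby neurons compete only weakly, which lets SOM-like neighborhoods of similar filters form, while distant neurons compete strongly.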

Cited by 66 publications (62 citation statements)
References 17 publications
“…This section summarizes the applied spiking neural network structure and learning algorithms using leaky integrate-and-fire (LIF) neurons and spike-timing-dependent plasticity (STDP), as initially outlined in [14], extended in [25,32], and supplemented in [33] with inter-neuron distance-dependent inhibition strength.…”
Section: Methods
confidence: 99%
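The citation above summarizes the method as leaky integrate-and-fire (LIF) neurons trained with spike-timing-dependent plasticity (STDP). A minimal sketch of both pieces is given below; the constants, the Euler update, and the pair-based (trace-driven) STDP form are illustrative assumptions, not the cited implementations:

```python
# Hedged sketch of an LIF update plus pair-based STDP. All constants
# (rest/threshold voltages, time constant, learning rate) are
# illustrative, not taken from the cited papers.
import numpy as np

def lif_step(v, i_in, v_rest=-65.0, v_thresh=-52.0, tau=100.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns (new_voltage, spiked); voltage resets to rest on a spike."""
    v = v + dt / tau * (v_rest - v) + i_in    # leak toward rest, add input
    spiked = v >= v_thresh
    if spiked:
        v = v_rest                            # reset after spike
    return v, spiked

def stdp_update(w, pre_trace, post_trace, pre_spike, post_spike,
                lr=0.01, w_max=1.0):
    """Pair-based STDP: potentiate on a post-synaptic spike in proportion
    to the pre-synaptic trace; depress on a pre-synaptic spike in
    proportion to the post-synaptic trace. Weight is clipped to [0, w_max]."""
    if post_spike:
        w += lr * pre_trace * (w_max - w)     # pre-before-post: strengthen
    if pre_spike:
        w -= lr * post_trace * w              # post-before-pre: weaken
    return float(np.clip(w, 0.0, w_max))
```

Combined with a distance-dependent inhibitory matrix, repeated LIF/STDP steps of this kind are what drive the self-organized filter lattice the abstract describes.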
“…We demonstrate increased learning speed on machine learning tasks due to these architectural changes. Next, we describe the PyTorch-based BindsNET 1 simulator [32,33], in which we were able to replicate previous results obtained with the BRIAN library and to implement new experiments exploiting the greater flexibility of BindsNET. The employed testbeds include images from the MNIST dataset [34] and images obtained from the Atari Breakout game.…”
Section: Outline of the Present Work
confidence: 99%
“…Some algorithms are based on supervised learning techniques, such as the non-recurrent SNN developed in [6] that extracts spike features from speech signals, and the framework for sound recognition presented in [7]. Unsupervised learning is exploited in other approaches, both for feature extraction [8] and for isolated word classification using the principles of Self-Organizing Maps [9,10]. Other studies instead use a reservoir-based technique to solve the task of isolated digit recognition [11,12].…”
Section: Introduction
confidence: 99%