2023
DOI: 10.3390/s23063037
Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities

Abstract: Spiking neural networks (SNNs) are a topic of growing interest. They resemble biological neural networks in the brain more closely than their second-generation counterparts, artificial neural networks (ANNs). SNNs have the potential to be more energy efficient than ANNs on event-driven neuromorphic hardware, which could drastically reduce maintenance costs for neural network models, since energy consumption would be much lower than for regular deep learning models …

Cited by 15 publications (3 citation statements)
References: 63 publications
“…Their computational power surpasses the abilities of ANNs in that they can process spike trains over time, decoding temporal information. Moreover, implementation of SNNs even on large scales is not difficult (Cessac et al., 2010; Pietrzak et al., 2023).…”
Section: Information Processing in Brain: Theoretical Concepts (mentioning)
Confidence: 99%
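The statement above rests on SNNs being sensitive to spike timing, not just spike counts. As a rough illustration (not taken from the cited papers), here is a minimal sketch of a discrete-time leaky integrate-and-fire (LIF) neuron processing a spike train; all parameter values (tau, v_thresh, w) are illustrative assumptions:

```python
import numpy as np

def lif_neuron(spike_train, tau=20.0, dt=1.0, v_thresh=1.0, v_reset=0.0, w=0.4):
    """Discrete-time LIF neuron driven by a binary input spike train.

    The membrane potential leaks with time constant tau and jumps by
    synaptic weight w on each input spike; an output spike is emitted
    when the potential crosses v_thresh, after which it resets.
    """
    decay = np.exp(-dt / tau)          # per-step leak factor
    v = v_reset
    out = np.zeros_like(spike_train)
    for t, s in enumerate(spike_train):
        v = v * decay + w * s          # leak, then integrate the input spike
        if v >= v_thresh:              # threshold crossing -> output spike
            out[t] = 1
            v = v_reset
    return out

# Two inputs with the same spike count but different timing yield
# different outputs: the neuron responds to temporal structure.
burst = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0, 0], dtype=float)
spread = np.array([1, 0, 0, 0, 1, 0, 0, 0, 1, 0], dtype=float)
print(lif_neuron(burst))   # the burst drives the potential over threshold
print(lif_neuron(spread))  # spread-out spikes leak away and never fire here
```

With these (hypothetical) parameters the clustered spikes cross threshold while the temporally spread spikes do not, which is the sense in which a spike train "decodes" temporal information beyond a rate code.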
“…Due to the use of discrete events in processing, SNNs compute a single response across multiple time steps, making them less efficient on standard synchronous computer hardware but potentially more effective on specialized neuromorphic hardware [17]. This specialized hardware, comprising asynchronous and event-driven circuits, guides the design of building blocks for hardware solutions, particularly advantageous for robotic platforms [18].…”
Section: Introduction (mentioning)
Confidence: 99%
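The efficiency argument in this statement comes from doing work only when events occur. As a minimal sketch of that event-driven idea (an assumed illustration, not the paper's method): between input spikes an exponentially leaking membrane can be advanced analytically, so silent time steps cost nothing, unlike a synchronous per-tick update. Class and parameter names below are hypothetical:

```python
import math

class EventDrivenLIF:
    """LIF neuron whose state is updated only at input events.

    Between events the membrane decays as v * exp(-(t - t_last) / tau),
    so no computation is spent on silent intervals -- the property that
    suits asynchronous, event-driven neuromorphic circuits.
    """
    def __init__(self, tau=20.0, v_thresh=1.0, v_reset=0.0):
        self.tau, self.v_thresh, self.v_reset = tau, v_thresh, v_reset
        self.v, self.t_last = v_reset, 0.0

    def on_event(self, t, weight):
        # Advance the state analytically from the last event to time t.
        self.v *= math.exp(-(t - self.t_last) / self.tau)
        self.t_last = t
        # Integrate the incoming spike and fire on threshold crossing.
        self.v += weight
        if self.v >= self.v_thresh:
            self.v = self.v_reset
            return True   # output spike
        return False

neuron = EventDrivenLIF()
events = [(1.0, 0.5), (2.0, 0.4), (50.0, 0.5), (51.0, 0.6)]  # (time, weight)
for t, w in events:
    if neuron.on_event(t, w):
        print(f"output spike at t={t}")
```

Only four updates are performed here even though the events span 51 time units; a synchronous simulation would update the neuron at every tick regardless of activity.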
“…The development of AI is growing fast due to the availability of vast amounts of data, making it easy to train machines to learn and make decisions (Duan et al., 2019). The growth of AI is also characterised by exponential growth in computing power; thus, the development of specialised hardware such as graphics processing units (GPUs) and tensor processing units (TPUs) has accelerated AI training and inference (Khan et al., 2021; Cowls et al., 2021; Pietrzak et al., 2023). Today, AI is emerging as a transformative force with applications spanning various industries, such as healthcare and medical surgery (Hashimoto et al., 2020; Gamble, 2020).…”
Section: Introduction (mentioning)
Confidence: 99%