2022
DOI: 10.3389/fncom.2022.859874
Toward Reflective Spiking Neural Networks Exploiting Memristive Devices

Abstract: The design of modern convolutional artificial neural networks (ANNs) composed of formal neurons copies the architecture of the visual cortex. Signals proceed through a hierarchy in which receptive fields become increasingly complex and coding increasingly sparse. ANNs now outperform humans in controlled pattern recognition tasks yet remain far behind in cognition. This is due in part to limited knowledge about the higher echelons of the brain hierarchy, where neurons actively generate predictions about what wi…

Cited by 30 publications (13 citation statements)
References: 208 publications
“…However, all these difficulties are technical and can be solved using a neurohybrid approach with memristive devices instead of internetwork axons. Memristors and memristive systems [57], which are implemented in the form of a CMOS-compatible nanostructure with a memory effect, are ideally suited for the role of such connections [44,58]. Recently, the first step in this direction has been taken: commercial memristive devices with the effect of short-term plasticity are used to arrange communication between individual subnets in vitro and provide synchronous activity of target subnets under the control of the source subnet [59].…”
Section: Discussion
confidence: 99%
“…Earlier, we formulated the basic principles of associative learning in SNNs [33,43,44]. They are (i) Hebbian learning (using STDP), (ii) synaptic competition or competition of SNN inputs, and (iii) neural competition or competition of SNN outputs.…”
Section: Introduction
confidence: 99%
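The three principles quoted above can be illustrated with a minimal sketch. The pair-based STDP rule below and the weight-normalization step (a simple stand-in for synaptic competition) use illustrative parameter values, not values from the cited works; function names are hypothetical.

```python
import numpy as np

# Pair-based STDP: weight change depends on the timing difference between
# a pre- and a postsynaptic spike. Amplitudes and time constants below are
# illustrative defaults, not taken from the cited papers.
A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # exponential time constants (ms)

def stdp_dw(delta_t):
    """Weight change for a spike pair with delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:   # pre fires before post -> potentiation (Hebbian)
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)
    else:             # post fires before pre -> depression
        return -A_MINUS * np.exp(delta_t / TAU_MINUS)

# Synaptic competition (principle ii), sketched as multiplicative
# normalization: inputs to a neuron compete for a fixed total weight.
def normalize(weights, total=1.0):
    return weights * (total / weights.sum())
```

Neural competition (principle iii) would additionally require lateral inhibition among output neurons, e.g. a winner-take-all stage, which is omitted here for brevity.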
“…33 Another way to mitigate the problem of memristive variations is to realize spiking NCSs with bio-inspired algorithms. [34][35][36][37][38] Although there is certain progress in the training of spiking NCSs, such as deep learning-inspired approaches, 39 surrogate gradient learning 40 and Python packages for spiking NCSs modelling like SpikingJelly 41 and SNNTorch, 39 efficient training algorithms for spiking NCSs are still underdeveloped, which complicates the transfer of memristor-based spiking NCSs from the current device level to a large system level. 42 Consequently, the search for new efficient memristor-based NCS architectures and training algorithms is of high interest.…”
Section: Introduction
confidence: 99%
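Surrogate gradient learning, mentioned in the statement above, sidesteps the non-differentiable spike function by using a smooth approximation in the backward pass only. A minimal sketch (a common sigmoid-derivative surrogate; the slope parameter is illustrative and the function names are hypothetical, not the API of SpikingJelly or snnTorch):

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: binary spike via the Heaviside step function."""
    return (np.asarray(v) >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, slope=5.0):
    """Backward pass: gradient of a sigmoid centered at the threshold,
    used in place of the Heaviside step's (zero/undefined) derivative."""
    s = 1.0 / (1.0 + np.exp(-slope * (np.asarray(v) - threshold)))
    return slope * s * (1.0 - s)
```

The surrogate is largest near the threshold, so gradient signal flows mainly through neurons whose membrane potential is close to firing, which is what makes backpropagation-style training of spiking networks practical.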
“…Reflective SNNs can make use of their inherent dynamics to mimic complicated, nonreflexive brain functions, such as the creation of new skills from previously learned ones. SNNs can be implemented as analog computational systems [ 13 ]. In addition to the abovementioned areas of application, SNNs are also used in the areas of cognitive processing.…”
Section: Introduction
confidence: 99%
“…As mentioned, the subject of SNNs involves many different hardware architectures and many different applications and research areas. SNNs are also the subject of many review articles [ 13 , 35 ]. In this review, we focus on the main learning approaches for SNNs in terms of their efficiency on synchronous digital hardware.…”
Section: Introduction
confidence: 99%