2019
DOI: 10.1109/access.2019.2915271
Study of Recall Time of Associative Memory in a Memristive Hopfield Neural Network

Abstract: Through associative memory, people can recall a pattern within microseconds to seconds. To emulate human memory, an artificial neural network should likewise spend a reasonable amount of time recalling patterns of differing task difficulty or familiarity. In this paper, we study the recall time of a memristive Hopfield network (MHN) implemented with memristor-based synapses. At operating frequencies of 1–100 kHz, patterns can be stored in the network by altering the resistance of the memristors, and the…
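The storage mechanism the abstract describes, with weights held as memristor resistances, can be sketched in software: each signed Hebbian weight maps to a differential pair of conductances, and one recall step thresholds the net read currents. The conductance range, pattern, and mapping below are illustrative assumptions, not the paper's actual circuit:

```python
import numpy as np

# Illustrative conductance range for the memristors (assumed values,
# not taken from the paper).
G_MIN, G_MAX = 1e-6, 1e-4  # siemens

def weights_to_conductances(W):
    """Map each signed weight to a differential pair (G_plus, G_minus)
    so the effective weight is proportional to G_plus - G_minus."""
    scale = (G_MAX - G_MIN) / np.abs(W).max()
    G_plus = G_MIN + scale * np.clip(W, 0, None)
    G_minus = G_MIN + scale * np.clip(-W, 0, None)
    return G_plus, G_minus

def recall_step(G_plus, G_minus, state, v_read=0.1):
    """One synchronous update: net column currents i = (G+ - G-) V
    are thresholded at zero, emulating the neuron's sign function."""
    currents = (G_plus - G_minus) @ (v_read * state)
    return np.where(currents >= 0, 1, -1)

# Store one 8-bit bipolar pattern via the outer-product (Hebbian) rule.
p = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = np.outer(p, p) / 8.0
np.fill_diagonal(W, 0.0)  # no self-connections

Gp, Gm = weights_to_conductances(W)
probe = p.copy()
probe[2] *= -1  # one corrupted bit in the probe
recalled = recall_step(Gp, Gm, probe)
print(np.array_equal(recalled, p))
```

The differential pair is a common way to encode signed weights with devices whose conductance is strictly positive; note that the G_MIN offset cancels in the subtraction, so only the programmed conductance difference carries the weight.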

Cited by 15 publications (13 citation statements)
References 34 publications
“…The work of [16] utilized HNN for transmitting binary amplitude-modulated signals based on the potential energy function, yielding a lower probability of error. In addition, the work of [17] emphasized HNN as one of the most-studied attractor-memory models because of its usefulness as a Content-Addressable Memory (CAM) in optimization models. Note that HNN can be split into continuous HNN (CHNN) and discrete HNN (DHNN).…”
Section: Introduction (mentioning)
confidence: 99%
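The CAM property emphasized in [17], where a corrupted probe retrieves the full stored pattern by its content rather than by an address, can be illustrated with a discrete HNN (DHNN); its asynchronous sign updates never increase the potential energy function of the kind mentioned in [16]. A minimal sketch with hand-picked 8-bit bipolar patterns (the patterns and network size are assumptions, not taken from the cited works):

```python
import numpy as np

def energy(W, s):
    """Hopfield potential energy E = -1/2 s^T W s. Asynchronous sign
    updates never increase it, so the network settles into a stored
    attractor: the content-addressable (CAM) behaviour."""
    return -0.5 * s @ W @ s

def async_recall(W, probe, sweeps=5, seed=1):
    """Update one randomly chosen neuron at a time, recording the
    energy after each full sweep to show its monotone descent."""
    rng = np.random.default_rng(seed)
    s = probe.copy()
    energies = [energy(W, s)]
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        energies.append(energy(W, s))
    return s, energies

# Two hand-picked, mutually orthogonal 8-bit bipolar patterns
# (illustrative assumptions).
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
P = np.stack([p1, p2])
W = (P.T @ P) / 8.0  # Hebbian outer-product storage
np.fill_diagonal(W, 0.0)

probe = p1.copy()
probe[0] = -1  # corrupt one bit; the content still addresses the memory
s, energies = async_recall(W, probe)
print(np.array_equal(s, p1), energies[0] >= energies[-1])
```

Here the corrupted probe converges back to p1 within one sweep, and the recorded energies only ever decrease, which is the attractor-memory picture behind the CAM description above.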
“…The review found that work using neural networks to develop computational architectures is oriented mainly toward network design together with learning algorithms that simulate different brain functions, at 38.1% [41, 42, 43, 44, 45, 46, 47, 48]. Next, the development of brain-simulation software accounts for 14.3% [49, 50, 51], and the development of hybrid architectures (brain-computer interfaces supported by neuromorphic processors) accounts for another 14.3% [52, 53, 54].…”
Section: Methods and Results (mentioning)
confidence: 99%
“…Artificial neural networks (ANNs) implemented with traditional complementary metal-oxide-semiconductor (CMOS) integrated circuits (ICs), which provide a certain level of intelligence, have been reported. [1][2][3][4] However, constructing CMOS circuits as synapses and neurons comes at the high cost of large chip area and power consumption. 5 A critical issue in constructing a large-scale artificial neural network is finding a suitable device structure that can be used to build neurons and synapses.…”
(mentioning)
confidence: 99%