2003
DOI: 10.1109/tnn.2003.813844
A new design method for the complex-valued multistate hopfield associative memory

Abstract: A method to store each element of an integral memory set M ⊂ {1,2,...,K}^n as a fixed point of a complex-valued multistate Hopfield network is introduced. The method employs a set of inequalities to render each memory pattern a strict local minimum of a quadratic energy landscape. Based on the solution of this system, it gives a recurrent network of n multistate neurons with complex and symmetric synaptic weights, which operates on the finite state space {1,2,...,K}^n to minimize this quadr…
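The network the abstract describes can be sketched as follows. This is a minimal illustration only: it assumes states {0, ..., K-1} encoded as K-th roots of unity (the abstract numbers states from 1), and it uses a simple generalized Hebbian rule for the weights as a stand-in — the paper's actual design method solves a system of inequalities instead.

```python
import numpy as np

K = 4  # number of neuron states (illustrative choice)

def encode(s):
    """Map integer states {0, ..., K-1} to K-th roots of unity."""
    return np.exp(2j * np.pi * s / K)

def csign(h):
    """Multistate activation: quantize each complex activation to the
    index of the nearest K-th root of unity."""
    phase = np.angle(h) % (2 * np.pi)
    return np.round(phase / (2 * np.pi / K)).astype(int) % K

def hebbian_weights(patterns):
    """Generalized Hebbian storage W = (1/n) * sum_p x_p x_p^H with zero
    self-connections; W is Hermitian (conjugate-symmetric), matching the
    symmetric-weight setting of the abstract."""
    X = encode(patterns)          # shape (P, n): one encoded pattern per row
    n = X.shape[1]
    W = X.T @ X.conj() / n
    np.fill_diagonal(W, 0)
    return W

def recall(W, s, iters=10):
    """Synchronous recurrent update x <- csign(W x) on state indices s."""
    x = encode(s)
    for _ in range(iters):
        s = csign(W @ x)
        x = encode(s)
    return s
```

With a single stored pattern, W x is a positive real multiple of x at every neuron, so the pattern is a fixed point of `recall`, and a state corrupted in one neuron is pulled back to the stored pattern — the noise-cancelling recall behavior the citing papers refer to.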

Cited by 208 publications (66 citation statements)
References 16 publications
“…Complex-valued neurons have often been utilized to represent multilevel information. For example, CHNNs have been applied to the storage of gray-scale images [15–17]. Given a stored pattern with noise, the CHNN cancels the noise and outputs the original pattern.…”
Section: Introduction
confidence: 99%
“…Hopfield neural networks (HNNs) have also been extended. Complex‐valued HNNs (CHNNs) are multistate HNN models and have often been applied to the storage of images. Moreover, several quaternionic HNNs (QHNNs), which are extensions using quaternions, have been proposed.…”
Section: Introduction
confidence: 99%
“…This rule is not practical because of its extremely low storage capacity. Several advanced learning algorithms, such as the projection learning rule, the pseudo‐relaxation learning algorithm, and the gradient descent learning rule (GDLR), have been proposed. The projection learning rule has a severe restriction in that the connections must be full.…”
Section: Introduction
confidence: 99%
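The projection learning rule mentioned in the excerpt above can be sketched as follows, assuming its standard pseudo-inverse form for complex-valued networks (an assumption for illustration, not the cited papers' exact formulation). It makes every stored pattern an exact fixed point, but the resulting weight matrix generally has nonzero self-connections — the "connections must be full" restriction the excerpt notes.

```python
import numpy as np

K = 4  # number of neuron states (illustrative choice)

def encode(s):
    """Map integer states {0, ..., K-1} to K-th roots of unity."""
    return np.exp(2j * np.pi * s / K)

def csign(h):
    """Quantize complex activations to nearest-root state indices."""
    phase = np.angle(h) % (2 * np.pi)
    return np.round(phase / (2 * np.pi / K)).astype(int) % K

def projection_weights(patterns):
    """Projection rule W = Xi (Xi^H Xi)^{-1} Xi^H, where the columns of Xi
    are the encoded patterns (assumed linearly independent).  W projects
    onto their span, so W @ x_p = x_p exactly for every stored pattern,
    at the price of a nonzero diagonal (full connections)."""
    Xi = encode(patterns).T                 # shape (n, P), patterns as columns
    G = Xi.conj().T @ Xi                    # P x P Gram matrix
    return Xi @ np.linalg.solve(G, Xi.conj().T)
```

Because W is a projection onto the span of the stored patterns, its trace equals the number of patterns, so the diagonal cannot be zeroed out without destroying the exact-fixed-point property — in contrast to the Hebbian rule, which simply deletes self-connections.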