1996
DOI: 10.1016/0925-2312(96)00086-0

Generalized Hopfield networks for associative memories with multi-valued stable states

Cited by 45 publications (27 citation statements)
References 4 publications

“…through properly selected or designed unsupervised or semi-supervised methods. Moreover, we plan to extend HoMCat to multilevel functional classes, by exploiting the theory of Generalized Hopfield networks [71], which replaces bi-level activation functions with their multilevel counterparts.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)

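The statement above refers to swapping the two-level (sign) activation of a standard Hopfield network for a multilevel counterpart, so that stable states can take several discrete values rather than only ±1. The sketch below is a minimal illustration of that idea only; the particular level set `levels` and the nearest-level quantizer `multilevel_activation` are illustrative assumptions, not the construction given in [71].

```python
import numpy as np

def sign_activation(u):
    # Standard bi-level Hopfield activation: states are +/-1.
    return np.where(u >= 0.0, 1.0, -1.0)

def multilevel_activation(u, levels=(-1.0, -1/3, 1/3, 1.0)):
    # Illustrative multilevel counterpart: quantize the local field onto
    # a fixed set of levels (a staircase with several plateaus).
    levels = np.asarray(levels)
    idx = np.argmin(np.abs(u[:, None] - levels[None, :]), axis=1)
    return levels[idx]

def hopfield_step(W, x, activation):
    # One synchronous update of a Hopfield-type network.
    return activation(W @ x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6))
    W = (A + A.T) / 2          # toy symmetric weights
    np.fill_diagonal(W, 0.0)

    x_bin = rng.choice([-1.0, 1.0], size=6)
    x_mul = rng.choice([-1.0, -1/3, 1/3, 1.0], size=6)
    for _ in range(20):
        x_bin = hopfield_step(W, x_bin, sign_activation)
        x_mul = hopfield_step(W, x_mul, multilevel_activation)

    print("bi-level state:   ", x_bin)   # components in {-1, +1}
    print("multi-valued state:", x_mul)  # components drawn from `levels`
```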
“…Several approaches have been proposed in the literature to extend the HN model, ranging from associative memories [71,56,37] to optimization problems [17,40,12] and system identification tasks [2]. The generalized HN models focused mainly on the output functions of neurons, which have been extended to have multiple inflection points.…”
Section: Hopfield Network With Neurons Partitioned Into Multiple Categories
Citation type: mentioning (confidence: 99%)

“…Other nonlinear output functions that are bistable (e.g., signum, piecewise linear, etc.) may increase the performance, but they cannot develop real-valued attractors, as reported in [23]. The incorporation of a logistic function may in certain conditions output real values; however, those values must be near 0 and 1.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)

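To make the saturation point concrete: iterating a Hopfield-style update with a sign activation forces every component to exactly ±1, while a high-gain logistic activation pushes components toward its saturation values rather than toward arbitrary real numbers. The sketch below is a rough demonstration under assumed parameters (gain `beta`, random symmetric weights); it is not the experiment from the cited paper.

```python
import numpy as np

def logistic(u, beta=4.0):
    # Logistic output function with gain beta; range (0, 1).
    return 1.0 / (1.0 + np.exp(-beta * u))

def iterate(W, x, f, steps=50):
    # Repeatedly apply a Hopfield-style synchronous update x <- f(W x).
    for _ in range(steps):
        x = f(W @ x)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((8, 8))
    W = (A + A.T) / 2          # toy symmetric weights
    np.fill_diagonal(W, 0.0)

    x0 = rng.uniform(0.0, 1.0, size=8)
    x_sign = iterate(W, np.sign(x0 - 0.5), lambda u: np.sign(u))
    x_logi = iterate(W, x0, logistic)

    print("sign activation (exactly +/-1):        ", x_sign)
    print("logistic activation (pushed toward 0/1):", np.round(x_logi, 3))
```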
“…In this case, we have (see the Appendix) Eqs. (23) and (24). Equations (23) and (24) show that each of the final weight matrices develops as a function of the nonlinear output function, a cross-correlation matrix and a normalization factor. In addition, each of the weight matrices depends on the value of the other one, thus making the updates a recurrent nonlinear dynamic process.…”
Section: B. Learning Rule
Citation type: mentioning (confidence: 99%)

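The statement describes a learning rule in which two weight matrices are rebuilt together: each new matrix is a normalized cross-correlation involving the nonlinear output function applied to quantities computed with the other matrix, so the learning itself becomes a recurrent nonlinear dynamic process. Since Eqs. (23) and (24) are not reproduced here, the sketch below only mimics that coupled structure with assumed names (`W1`, `W2`, the stand-in output function `f`, and normalization by the number of patterns); it is not the actual rule from the paper.

```python
import numpy as np

def f(u):
    # Assumed nonlinear output function (tanh used purely as a stand-in).
    return np.tanh(u)

def coupled_update(W1, W2, X):
    """One pass of an illustrative coupled learning rule.

    X holds one stored pattern per column. Each weight matrix is rebuilt
    from a normalized cross-correlation between the patterns and the
    nonlinear response computed with the *other* matrix, so the two
    updates depend on each other (a recurrent nonlinear process).
    """
    n_patterns = X.shape[1]
    W1_new = (f(W2 @ X) @ X.T) / n_patterns
    W2_new = (f(W1 @ X) @ X.T) / n_patterns
    return W1_new, W2_new

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n, p = 6, 4
    X = rng.choice([-1.0, 1.0], size=(n, p))   # stored patterns (columns)
    W1 = rng.standard_normal((n, n)) * 0.1
    W2 = rng.standard_normal((n, n)) * 0.1

    for _ in range(10):                         # iterate the coupled updates
        W1, W2 = coupled_update(W1, W2, X)

    print("W1 after 10 coupled updates:\n", np.round(W1, 3))
    print("W2 after 10 coupled updates:\n", np.round(W2, 3))
```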