A discrete gradient method to enhance the numerical behaviour of Hopfield networks
2015 | DOI: 10.1016/j.neucom.2014.10.091

Cited by 13 publications (12 citation statements) | References 32 publications
“…The system described by (1) is indeed a dynamic system, confirming the existence of an energy function (or Lyapunov function) which decreases through system trajectories [19]. The energy function of the HNN described in (1) can be defined as follows [18]:…”
Section: A. HNN
Mentioning confidence: 75%
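The statement refers to Eq. (1) of the citing paper, which is not reproduced in this excerpt. For context, a standard form of the continuous Hopfield energy function (an assumed textbook form; the citing paper's exact notation may differ) is

```latex
E(v) = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} w_{ij}\, v_i v_j
       - \sum_{i=1}^{n} I_i v_i
       + \sum_{i=1}^{n} \frac{1}{\tau_i}\int_{0}^{v_i} g_i^{-1}(s)\, ds
```

For symmetric weights w_ij = w_ji and monotonically increasing activations g_i, dE/dt <= 0 along the continuous trajectories, which is the Lyapunov (energy-decrease) property the quoted passage refers to.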
“…The key to utilizing a continuous HNN as an optimization method is matching the objective function to the energy function and then constructing the HNN accordingly [19].…”
Section: A. HNN
Mentioning confidence: 99%
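As a rough illustration of that matching step, the sketch below (hypothetical names, not taken from the cited papers) uses a quadratic objective f(v) = -1/2 v^T W v - I^T v, which lines up with the first two terms of the energy function above, and then integrates the network dynamics with forward Euler. The explicit Euler step only decreases the energy approximately for small step sizes, which is the numerical issue the discrete gradient method of the indexed paper is meant to address.

```python
import numpy as np

# Sketch only (hypothetical names): map a quadratic objective
#   f(v) = -1/2 v^T W v - I^T v
# onto the weights W and bias currents I of a continuous Hopfield network
# and integrate the dynamics with forward Euler. The continuous flow
# decreases the energy for symmetric W; the explicit discretization only
# does so approximately for small dt.

def hopfield_descent(W, I, steps=500, dt=0.01, beta=5.0):
    n = W.shape[0]
    u = np.zeros(n)                                   # internal states
    g = lambda x: 1.0 / (1.0 + np.exp(-beta * x))     # sigmoid activation
    for _ in range(steps):
        v = g(u)
        du = -u + W @ v + I                           # continuous HNN right-hand side
        u = u + dt * du                               # forward Euler step
    return g(u)

# Small symmetric coupling matrix and bias vector as a toy problem.
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
I = np.array([0.5, -0.25])
print(hopfield_descent(W, I))
```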
“…Celledoni et al [13] extended the Itoh-Abe discrete gradient method to optimisation problems defined on Riemannian manifolds. Hernández-Solano et al [29] combined a discrete gradient method with Hopfield networks in order to preserve a Lyapunov function for optimisation problems.…”
Section: Related Literature
Mentioning confidence: 99%
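To make the discrete gradient idea concrete, here is a minimal sketch (hypothetical names; restricted to a quadratic objective, and not the authors' implementation) of an Itoh-Abe-type coordinate update. For a quadratic V the implicit per-coordinate equation has a closed-form root, and V decreases at every iteration for any step size tau > 0, which is the Lyapunov-preservation property the citation describes.

```python
import numpy as np

# Sketch only (hypothetical names): Itoh-Abe discrete gradient step for a
# quadratic objective V(x) = 1/2 x^T A x - b^T x. Each coordinate i solves
#   d^2 = -tau * (V(y + d e_i) - V(y)),
# which for a quadratic V has the closed-form nonzero root used below.
# The resulting update satisfies V(x_new) <= V(x) for any tau > 0.

def V(A, b, x):
    return 0.5 * x @ A @ x - b @ x

def itoh_abe_step(A, b, x, tau):
    y = x.copy()
    for i in range(len(x)):
        g_i = A[i] @ y - b[i]                          # dV/dx_i at the partially updated point
        d = -tau * g_i / (1.0 + 0.5 * tau * A[i, i])   # closed-form coordinate increment
        y[i] += d
    return y

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])
x = np.array([5.0, -4.0])
for _ in range(20):
    x_new = itoh_abe_step(A, b, x, tau=1.0)
    assert V(A, b, x_new) <= V(A, b, x) + 1e-12        # monotone decrease even with a large step
    x = x_new
print(x, V(A, b, x))                                   # approaches the minimiser of V
```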
“…The original HNN can only deal with the discrete binary pattern recognition by using Hebb's rule [17] and its memory capacity is limited to the network size [18]. However, in recent years lots of works [19][20][21][22][23][24][25][26] have studied the memory capacity and invented different kinds of continuous HNN to deal with the continuous value pattern. We leverage HNN's advantage in warping pattern recognition and the segmentation method in our work, proposing a training-based pattern matching approach, which only needs to be trained on the predefined template pattern.…”
Section: Introduction
Mentioning confidence: 99%
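For reference, the discrete binary Hopfield network that the quotation contrasts with can be sketched as below (hypothetical names; a textbook construction, not code from the cited works): Hebb's outer-product rule stores ±1 patterns, asynchronous sign updates recall them, and the capacity for random patterns is limited to roughly 0.14 times the number of neurons.

```python
import numpy as np

# Sketch only (hypothetical names): classic discrete Hopfield network.
# Hebb's outer-product rule stores +/-1 patterns; recall runs asynchronous
# sign updates until the state settles into a stored attractor.

def hebb_weights(patterns):
    n = patterns.shape[1]
    W = patterns.T @ patterns / n          # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)               # no self-connections
    return W

def recall(W, state, sweeps=10, seed=0):
    s = state.copy()
    rng = np.random.default_rng(seed)
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):  # asynchronous, random-order updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = hebb_weights(patterns)
noisy = patterns[0].copy()
noisy[0] *= -1                             # corrupt one bit
print(recall(W, noisy))                    # recovers the first stored pattern
```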