AIP Conference Proceedings 2009
DOI: 10.1063/1.3223914

Energy Relaxation for Hopfield Network With the New Learning Rule

Abstract: In this paper, the energy relaxation time of the Little-Hopfield neural network using the new learning rule is shown to be shorter than the relaxation time obtained with Hebbian learning. We argue that this should be so given the characteristics of the activation function, and computer simulations confirm that this is indeed the case. Computer simulations also show that the new learning rule has a higher capacity than the Hebb rule. Section 3 covers logic programming on a neural network, focused on …
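The paper compares how quickly the network's energy relaxes under Hebbian learning and under the new learning rule. As background, the sketch below (a minimal illustration, not the authors' simulation code) shows asynchronous energy relaxation in a bipolar Hopfield network trained with the standard Hebbian rule, the baseline being compared against; the network size, noise level, and stopping criterion are illustrative assumptions.

import numpy as np

def hebbian_weights(patterns):
    # Standard Hebbian rule: W = (1/N) * sum_p x_p x_p^T with zero self-connections.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def energy(w, s):
    # Hopfield energy E = -1/2 * s^T W s (no external field).
    return -0.5 * s @ w @ s

def relax(w, s, max_sweeps=100, rng=None):
    # Asynchronous updates until the state stops changing; the energy never
    # increases, so the loop halts in a local minimum. Returns the relaxed
    # state and the number of sweeps taken.
    rng = np.random.default_rng() if rng is None else rng
    s = s.copy()
    for sweep in range(1, max_sweeps + 1):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if w[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            return s, sweep
    return s, max_sweeps

# Illustrative run: store 3 random bipolar patterns in a 100-unit network,
# perturb one pattern, and count how many sweeps relaxation takes.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))
w = hebbian_weights(patterns)
noisy = patterns[0] * rng.choice([1, -1], size=100, p=[0.9, 0.1])
state, sweeps = relax(w, noisy, rng=rng)
print(sweeps, energy(w, state))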

Cited by 6 publications (9 citation statements). References 8 publications.
“…The hyperbolic tangent activation function is written as follows: [equation not reproduced]. Its advantage is a broader output space than the linear activation function. The output range is -1 to 1, similar to the conventional sigmoid function [6,10]. This scaled, bounded output helps the network produce good outputs.…”
Section: Hyperbolic Tangent Activation Function
confidence: 99%
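The equation referred to in the excerpt above did not survive extraction; the standard hyperbolic tangent, which matches the stated output range of -1 to 1, is

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad -1 < \tanh(x) < 1.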
“…The demarcation for unit i's activation, a_i, is given as follows: [equation not reproduced] [6,27]. The connection model can be generalized to include higher-order connections.…”
Section: A Hopfield Model
confidence: 99%
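The defining equation in this excerpt was also lost in extraction. In the Little-Hopfield setting the paper works in, the bipolar activation of unit i is usually demarcated by a sign condition on its local field, and the higher-order generalisation mentioned in the excerpt adds products of unit states; the threshold \theta_i and the truncation at third-order connections below are illustrative assumptions, not a reproduction of the cited equation.

a_i = \begin{cases} 1, & \text{if } \sum_j w_{ij} a_j \ge \theta_i, \\ -1, & \text{otherwise,} \end{cases}
\qquad
h_i = \sum_{j,k} w^{(3)}_{ijk} a_j a_k + \sum_j w^{(2)}_{ij} a_j + w^{(1)}_i .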
“…Compared to a neural network, which is a black-box model, a logic program is easier to understand, easier to verify, and also easier to change. 6 The assimilation of both paradigms (logic programming and the Hopfield network) was presented by Wan Abdullah and revolves around propositional Horn clauses. 7,8 Gadi Pinkas and Wan Abdullah 7,9 proposed a bi-directional mapping between logic and energy functions in a symmetric neural network.…”
Section: Introduction
confidence: 99%
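The mapping mentioned in this excerpt works by assigning an energy penalty to every violated clause, so that minima of the network's energy correspond to interpretations satisfying the logic program. The sketch below illustrates that cost-function idea for a tiny Horn-clause program in the bipolar (+1/-1) representation; the example program and the 1/2-per-literal normalisation are illustrative assumptions, not the exact construction from the cited works.

from itertools import product

def clause_cost(state, head, body):
    # Penalty for one Horn clause head <- body: equals 1 exactly when every
    # body atom is true (+1) while the head is false (-1), i.e. when the
    # clause is violated, and 0 otherwise.
    cost = 0.5 * (1 - state[head])
    for atom in body:
        cost *= 0.5 * (1 + state[atom])
    return cost

def program_energy(state, clauses):
    # Total inconsistency of a truth assignment with respect to the program.
    return sum(clause_cost(state, head, body) for head, body in clauses)

# Illustrative program: A <- B, C  and the fact  B.
clauses = [("A", ("B", "C")), ("B", ())]
atoms = ["A", "B", "C"]

# Enumerating assignments shows that zero-energy states are exactly the
# satisfying interpretations of the program.
for values in product([-1, 1], repeat=len(atoms)):
    state = dict(zip(atoms, values))
    print(state, program_energy(state, clauses))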