1991
DOI: 10.1162/neco.1991.3.2.282

Symmetric Neural Networks and Propositional Logic Satisfiability

Abstract: St. Louis, MO 63230 USA. Connectionist networks with symmetric weights (like Hopfield networks and Boltzmann Machines) use gradient descent to find a minimum of quadratic energy functions. We show an equivalence between the problem of satisfiability in the propositional calculus and the problem of minimizing those energy functions. The equivalence is in the sense that for any satisfiable well-formed formula (WFF) we can find a quadratic function that describes it, such that the set of solutions that minimizes the…
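The equivalence sketched in the abstract can be illustrated with a minimal example (not the paper's actual construction): each two-literal clause is encoded as a quadratic penalty over 0/1 variables, so a satisfying assignment is exactly a zero-energy state. The formula, clause encoding, and function names below are hypothetical illustrations.

```python
from itertools import product

# A clause is a list of literals; each literal is (var_index, is_positive).
# Penalty for (l1 or l2): product of "literal is false" indicators,
# which is a quadratic polynomial in the 0/1 variables.
def clause_energy(assignment, clause):
    e = 1.0
    for var, positive in clause:
        x = assignment[var]
        e *= (1 - x) if positive else x
    return e

def formula_energy(assignment, clauses):
    return sum(clause_energy(assignment, c) for c in clauses)

# Hypothetical example formula: (x0 or x1) and (not x0 or x1)
clauses = [[(0, True), (1, True)], [(0, False), (1, True)]]

# Brute-force the minimum; any zero-energy state satisfies the formula.
best = min(product([0, 1], repeat=2), key=lambda a: formula_energy(a, clauses))
print(best, formula_energy(best, clauses))
```

For clauses of three or more literals the product penalty is no longer quadratic; the paper's construction handles this via auxiliary (hidden) variables, which this sketch omits.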

Cited by 66 publications
(46 citation statements)
References 6 publications
“…Thus, the energy value is vital for separating local minimum and global minimum solutions. The global minimum energy can be pre-calculated because the total magnitude of the energy corresponding to the MAX-kSAT clauses is always constant (Pinkas 1991; Wan Abdullah 1993). The retrieval power of HNN always depends on how the synaptic weights are computed.…”
Section: Logic Programming In Discrete Hopfield Neural Network
confidence: 99%
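The pre-calculated global minimum this passage refers to can be illustrated with a small brute-force check: for a symmetric, zero-diagonal weight matrix, the Hopfield (Lyapunov) energy of every bipolar state can be enumerated and the global minimum read off directly. The weight values below are toy assumptions, not taken from any of the cited papers.

```python
import itertools
import numpy as np

# Hopfield (Lyapunov) energy for bipolar states s in {-1, +1}^n:
#   E(s) = -1/2 * s^T W s - h^T s   (W symmetric, zero diagonal)
def energy(s, W, h):
    s = np.asarray(s, dtype=float)
    return -0.5 * s @ W @ s - h @ s

# Toy weights and biases (hypothetical values)
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
h = np.array([0.5, -0.5])

# Enumerate all bipolar states and find the global minimum energy.
states = list(itertools.product([-1, 1], repeat=2))
E_min = min(energy(s, W, h) for s in states)
```

In the logic-programming setting, a retrieved state is accepted as a (global) solution precisely when its energy matches this pre-computed minimum; enumerating states is only feasible for small networks, which is why the constant-energy property matters.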
“…Thus, logic programming can be interpreted as a problem from a combinatorial optimization standpoint. Pinkas (1991) expanded the idea of a logic program by integrating propositional knowledge, or a logical mapping system, via a symmetric connectionist network. Hence, the proposed symmetric connectionist network (SCN) has attracted researchers to revive many domains of artificial neural networks such as the Hopfield Neural Network, the Boltzmann Machine, Harmony Theory and Mean Field Theory.…”
Section: Introduction
confidence: 99%
“…We use neurons to store the truth values of atoms in order to write a cost function that is minimized when all the clauses are satisfied. In addition, a bi-directional mapping between propositional logic formulas and the energy functions of symmetric neural networks was defined by Gadi Pinkas [6,7] and Wan Abdullah [4]. We show below how logic programming can be interpreted as a combinatorial optimization problem and implemented on a neural network.…”
Section: Higher Order Logic Programming
confidence: 99%
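The bi-directional mapping mentioned here can be sketched in the style of Wan Abdullah's comparison method: expand the cost of a clause over bipolar states and solve for the synaptic weights that make a Hopfield energy reproduce it. The clause (A ∨ B) and all numerical values below are illustrative assumptions, not taken from the cited works.

```python
from itertools import product
import numpy as np

# Cost of the clause (A or B) in bipolar variables sA, sB in {-1, +1}:
# equals 1 when both literals are false, 0 otherwise.
def clause_cost(sA, sB):
    return 0.25 * (1 - sA) * (1 - sB)

# Match the cost against an energy of the form
#   E = -w_AB*sA*sB - wA*sA - wB*sB + c
# by solving a linear system over all four bipolar states.
states = list(product([-1, 1], repeat=2))
M = np.array([[-sA * sB, -sA, -sB, 1.0] for sA, sB in states])
y = np.array([clause_cost(sA, sB) for sA, sB in states])
w_AB, wA, wB, c = np.linalg.solve(M, y)
```

Reading the solution off gives the synaptic weight between A and B and the bias on each neuron; states that satisfy the clause sit at the energy minimum by construction, which is the sense in which the mapping runs in both directions.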
“…In addition, the optimization of logical inconsistency is carried out by the network after the connection strengths are defined from the higher-order logic program [5]. Consequently, Pinkas [6,7] emphasized a bi-directional mapping between propositional logic formulas and the Hopfield network by integrating an energy minimization scheme to achieve global convergence. In this paper, we will ponder and analyze the performance of doing higher-order logic programming in a Hopfield network.…”
Section: Introduction
confidence: 99%
“…Basically, logic programming can be treated as a problem from a combinatorial optimization standpoint [3,7]. Therefore, it can be carried out in a neural network to obtain the desired solutions.…”
Section: Introduction
confidence: 99%