1997
DOI: 10.1109/72.641450

Recurrent neural nets as dynamical Boolean systems with application to associative memory

Abstract: Discrete-time/discrete-state recurrent neural networks are analyzed from a dynamical Boolean systems point of view in order to devise new analytic and design methods for the class of both single and multilayer recurrent artificial neural networks. With the proposed dynamical Boolean systems analysis, we are able to formulate necessary and sufficient conditions for network stability which are more general than the well-known but restrictive conditions for the class of single layer networks: (1) symmetric weight…
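To make the setting concrete, here is a minimal Python sketch of a discrete-time/discrete-state (±1) single-layer recurrent network run as an associative memory, using the classical symmetric Hebbian weights that the restrictive single-layer stability conditions assume. The function names and toy patterns are our own illustration, not the paper's construction or notation:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian outer-product rule: symmetric weights, zero diagonal
    (the classical restrictive setting the paper generalizes)."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / n

def step(W, x):
    """One synchronous discrete-time update of the +/-1 state vector;
    a zero field leaves the corresponding neuron unchanged."""
    h = W @ x
    return np.where(h > 0, 1, np.where(h < 0, -1, x))

def recall(W, x, max_iters=50):
    """Iterate the Boolean dynamics until a fixed point (stable equilibrium)."""
    for _ in range(max_iters):
        x_next = step(W, x)
        if np.array_equal(x_next, x):
            return x
        x = x_next
    return x  # may land on a limit cycle under synchronous updates

# Store two patterns, then recall from a one-bit-corrupted probe.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = hebbian_weights(patterns)
probe = np.array([1, -1, 1, -1, -1, -1])  # pattern 0 with bit 4 flipped
print(recall(W, probe))                   # recovers pattern 0
```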

Cited by 32 publications (20 citation statements)
References 16 publications

“…The dynamics of rather simple Boolean recurrent neural networks can implement an associative memory with bioinspired features [71], [72]. In the Hopfield framework, stable equilibria of the network that do not represent any valid configuration of the optimisation problem are referred to as spurious attractors .…”
Section: Methods (mentioning)
confidence: 99%
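To see what a spurious attractor is concretely, here is a toy continuation of the sketch above (our own construction, not from either cited work): store three patterns and probe with their majority-vote mixture, a classic spurious state that is a stable equilibrium yet matches no stored pattern.

```python
# Reuses numpy, hebbian_weights, step, and recall from the sketch above.
patterns3 = np.array([
    [ 1,  1,  1,  1, -1, -1,  1,  1, -1, -1, -1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1,  1,  1, -1, -1],
    [-1, -1,  1,  1,  1,  1, -1, -1, -1, -1,  1,  1],
])
W3 = hebbian_weights(patterns3)

# Majority vote of the three stored patterns (a "mixture" state).
mixture = np.sign(patterns3.sum(axis=0)).astype(int)
stable = recall(W3, mixture)

# The mixture is a fixed point here, but it matches no stored pattern
# (nor a sign-flipped one): a spurious attractor.
is_stored = any(np.array_equal(stable, s * p)
                for p in patterns3 for s in (1, -1))
print(stable, is_stored)  # is_stored == False
```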
“…Now, we will present a solution to ensure the non-fragile finite-time $l_2$–$l_\infty$ state estimator design problem for the augmented estimation error system $(\tilde{\Sigma})$ based on Theorem 2: MSFTB with respect to $(a_1, a_2, N, R, W)$ and satisfying a prescribed performance level, subject to the symmetric block condition
$$\begin{bmatrix} \tilde{\Phi}_i^{(11)} & \tilde{\Phi}_i^{(12)} & \tilde{\Phi}_i^{(13)} & \tilde{\Phi}_i^{(14)} \\ * & \tilde{\Phi}_i^{(22)} & \tilde{\Phi}_i^{(23)} & \tilde{\Phi}_i^{(24)} \end{bmatrix} \cdots$$ …”
Section: Theorem 2 Given Scalars (mentioning)
confidence: 99%
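A reading aid for the snippet above, in generic notation rather than the cited paper's exact symbols: in this LMI literature an asterisk stands for a block determined by symmetry, and "MSFTB with respect to $(a_1, a_2, N, R, W)$" typically abbreviates mean-square finite-time boundedness with respect to the stated bounds, horizon, weighting matrix, and disturbance set. The asterisk convention, for example:

$$
\begin{bmatrix} \Phi^{(11)} & \Phi^{(12)} \\ * & \Phi^{(22)} \end{bmatrix}
:=
\begin{bmatrix} \Phi^{(11)} & \Phi^{(12)} \\ \big(\Phi^{(12)}\big)^{\mathsf{T}} & \Phi^{(22)} \end{bmatrix}.
$$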
“…By the Schur complement, it follows from (38) that
$$\Phi_i = \begin{bmatrix} \Phi_i^{(11)} & \Phi_i^{(12)} & \bar{\Phi}_i^{(13)} \\ * & \Phi_i^{(22)} & \Phi_i^{(23)} \\ * & * & \bar{\Phi}_i^{(33)} \end{bmatrix} \cdots$$ …”
Section: Theorem 2 Given Scalars (mentioning)
confidence: 99%
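For reference, the Schur complement step invoked above is the standard equivalence for a symmetric block matrix (generic symbols):

$$
\begin{bmatrix} A & B \\ B^{\mathsf{T}} & C \end{bmatrix} \prec 0
\quad\Longleftrightarrow\quad
C \prec 0 \quad\text{and}\quad A - B C^{-1} B^{\mathsf{T}} \prec 0,
$$

the usual device for passing from a condition such as (38) to an equivalent symmetric block form like $\Phi_i \prec 0$.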
“…Neural networks (NNs) have received considerable attention for their potential applications in many areas, such as image processing, pattern recognition, associative memories, and certain optimization problems [1,2,3]. Since the switching speed of information processing and of inherent neuron communication is finite, time delays are unavoidable in neural networks, and their presence may lead to instability or significantly degraded performance.…”
Section: Introduction (mentioning)
confidence: 99%