2003
DOI: 10.1109/tnn.2003.810594
Global exponential stability of competitive neural networks with different time scales

Abstract: The dynamics of cortical cognitive maps developed by self-organization must include the aspects of long- and short-term memory. The behavior of such a neural network is characterized by an equation of neural activity as a fast phenomenon and an equation of synaptic modification as a slow part of the neural system. We present a new method of analyzing the dynamics of a biologically relevant system with different time scales based on the theory of flow invariance. We are able to show the conditions under which the …

Cited by 115 publications
(56 citation statements)
References 9 publications
“…All of these conditions concern the stability of neural networks. The stability of neural networks with different time scales is a key problem both for pattern storage in short-term and long-term memory [19] and for solving optimization problems [16]. For example, we can design a neural network as in (3) to store short-term and long-term patterns; when the weights satisfy condition (10), this neural network works.…”
Section: Will Make the Dynamic Neural Network (3) Input-to-State Stable
confidence: 99%
“…This shows that (t) is a monotone decreasing function. By the assumption that the solution of network (1) is bounded, it follows that the solutions of (5), (8) … Proof. Using Lemma 3, from (7) and (9), it follows that lim_{t→+∞} x(t), S(t) = 0.…”
Section: Complete Convergence
confidence: 99%
“…This class of neural networks has attracted extensive interest; see, for example, [1,3,7-9] and the references therein. Many of these studies concern a single equilibrium point of the networks.…”
Section: Introduction
confidence: 99%
“…Recently, Meyer-Bäse et al [13][14][15] proposed the so-called competitive neural networks with different time scales. In the competitive neural networks model, there are two types of state variables: the short-term memory (STM) variable describing the fast neural activity, and the long-term memory (LTM) variable describing the slow unsupervised synaptic modifications.…”
confidence: 99%
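The STM/LTM structure described in the citation statement above can be sketched numerically. The following is a minimal, illustrative simulation of a generic two-time-scale competitive network in the spirit of the Meyer-Bäse model: a fast STM equation for neural activity and a slow LTM equation for synaptic modification. The specific equation forms, activation function, and all parameter values here are assumptions for illustration, not the paper's exact system or stability conditions.

```python
import numpy as np

def f(x):
    """Assumed sigmoidal activation (the paper's exact nonlinearity may differ)."""
    return np.tanh(x)

def simulate(eps=0.05, a=1.0, B=1.0, T=50.0, dt=1e-3, seed=0):
    """Euler-integrate a generic fast/slow STM-LTM system (illustrative only).

    STM (fast):  eps * dx/dt = -a*x + W @ f(x) + B * (m @ y)
    LTM (slow):        dm/dt = -m + outer(f(x), y)   # Hebbian-like update
    """
    rng = np.random.default_rng(seed)
    n, p = 4, 3                              # neurons, stimulus dimension (assumed)
    W = 0.1 * rng.standard_normal((n, n))    # lateral connection weights (assumed)
    y = rng.standard_normal(p)               # constant external stimulus (assumed)
    x = rng.standard_normal(n)               # STM state: fast neural activity
    m = 0.1 * rng.standard_normal((n, p))    # LTM state: slow synaptic weights
    for _ in range(int(T / dt)):
        S = m @ y                            # stimulus filtered through LTM traces
        dx = (-a * x + W @ f(x) + B * S) / eps   # fast dynamics (note 1/eps)
        dm = -m + np.outer(f(x), y)              # slow dynamics
        x += dt * dx
        m += dt * dm
    return x, m
```

Because eps << 1, the STM variable relaxes quickly toward a quasi-equilibrium set by the current LTM weights, while the LTM weights drift slowly: this time-scale separation is what the flow-invariance analysis in the paper exploits to give conditions for global exponential stability.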