2005
DOI: 10.1016/j.physleta.2005.04.095
Global stability analysis of Cohen–Grossberg neural networks with time varying delays

Cited by 156 publications (69 citation statements)
References 16 publications
“…for t > 0, i = 1, 2, …, n. Furthermore, model (3) also comprises the Cohen–Grossberg neural network model (4) with neither impulses nor stochastic effects [13]; model (4) is also a general neural network that covers the delayed Cohen–Grossberg neural network models studied in [3]–[12].…”
Section: Model Description and Preliminaries (mentioning)
Confidence: 99%
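The excerpt refers to model equations (3) and (4) without reproducing them. For orientation, a commonly studied form of the Cohen–Grossberg neural network with time-varying delays is sketched below; the symbols (amplification functions a_i, behaved functions b_i, activation functions f_j and g_j, delays τ_ij(t), external inputs J_i) follow the usual convention and are stated here as an assumption, since the exact equations of the citing papers are not quoted:

$$
\dot{x}_i(t) = -a_i(x_i(t))\Big[b_i(x_i(t)) - \sum_{j=1}^{n} c_{ij}\, f_j(x_j(t)) - \sum_{j=1}^{n} d_{ij}\, g_j\big(x_j(t-\tau_{ij}(t))\big) + J_i\Big], \quad i = 1,\dots,n.
$$

With impulses and stochastic perturbations absent, a model of this type corresponds to the deterministic delayed Cohen–Grossberg networks the excerpt attributes to [3]–[12].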
“…In practice, due to the finite speeds of the switching and transmission of signals, time delays do exist in a working network and thus should be incorporated into the model equation [3,26,27]. In recent years, the dynamical behaviors of Cohen-Grossberg neural networks with constant delays or time-varying delays or distributed delays have been studied, see for example [3][4][5][6][7][8][9][10][11][12][13] and the references therein.…”
Section: Introduction (mentioning)
Confidence: 99%
“…It is also important to incorporate time delays into various neural networks. In recent years, a number of results have been obtained on global asymptotic stability, global exponential stability and periodic solutions for neural networks with constant or time-varying delays (see, e.g., [1]–[6], [8], [10]–[17], [19]–[24], [29]–[31]). Although the use of finite delays in models with delayed feedback provides a good approximation to simple circuits consisting of a small number of neurons, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths.…”
Section: Introduction (mentioning)
Confidence: 99%
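The excerpt motivates going beyond finite (discrete) delays to spatially distributed delays. A generic way such distributed delays enter the model, sketched here under the assumption of a standard convolution-kernel formulation rather than quoted from the citing paper, is to replace the discretely delayed term by

$$
\sum_{j=1}^{n} d_{ij} \int_{-\infty}^{t} K_{ij}(t-s)\, g_j(x_j(s))\, ds, \qquad K_{ij} \ge 0, \quad \int_{0}^{\infty} K_{ij}(u)\, du = 1,
$$

where the kernel K_ij weights past states of neuron j instead of using the single delayed value x_j(t − τ_ij(t)).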
“…In addition, assumption (H3) on the behaved functions in our results is the same as that in [5,8,18], and the differentiability condition imposed on the behaved functions in [2]–[4], [6], [7], [9]–[11] is removed in our results.…”
Section: Remark (mentioning)
Confidence: 91%
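The remark contrasts assumption (H3) on the behaved functions with the differentiability conditions required in earlier work. The exact statement of (H3) is not reproduced in the excerpt; a typical condition of this kind, given only as an illustration, places a positive lower bound on the difference quotient rather than on a derivative:

$$
\frac{b_i(u) - b_i(v)}{u - v} \ge \gamma_i > 0 \quad \text{for all } u \ne v,
$$

which constrains how slowly b_i can increase without assuming that b_i is differentiable.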
“…For simplicity of the presentation, the proof is skipped. In [2]–[16], [19,20,21,23,25,26], the amplification functions were required to satisfy 0 < a_i(·) …”
Section: Remark (mentioning)
Confidence: 99%
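The condition on the amplification functions is cut off in the excerpt. In much of the cited Cohen–Grossberg literature the requirement is a two-sided positivity and boundedness bound; the following form is a standard assumption reconstructed for context, not a quotation from the cited works:

$$
0 < \underline{a}_i \le a_i(u) \le \overline{a}_i < \infty \quad \text{for all } u \in \mathbb{R}, \ i = 1,\dots,n.
$$

The quoted remark lists the papers in which a condition of this type was imposed on the amplification functions a_i.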