2006
DOI: 10.1016/j.physleta.2005.12.005
New results on stability analysis of neural networks with time-varying delays

Cited by 185 publications (118 citation statements)
References 14 publications
“…To begin with, no system transformation is performed on the original system; thus, there is no need to overbound the cross-product terms induced by the transformation. In addition, $V_4(x)$ is included in the Lyapunov functional $\tilde{V}(x_t)$, so that when estimating $\dot{\tilde{V}}(x_t)$ we are able to retain not only the integral $-\int_{t-\tau(t)}^{t} \dot{x}(\alpha)^{T} Z \dot{x}(\alpha)\, d\alpha$ but also $-\int_{t-\tau}^{t-\tau(t)} \dot{x}(\alpha)^{T} Z \dot{x}(\alpha)\, d\alpha$, which was often ignored in previous papers [10]. Finally, as the two integrals in $\dot{\tilde{V}}(x_t)$ are dealt with by free-weighting-matrix methods, the resulting terms $\tau(t) N Z^{-1} N^{T}$ and $(\tau - \tau(t)) M Z^{-1} M^{T}$ are not enlarged as in [8], but are kept as they are and handled with convex-combination methods.…”
Section: Now Our Elaborate Estimation of $\dot{\tilde{V}}(x_t)$ Induces a Convex Combination (mentioning)
confidence: 99%
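The convex-combination argument in the excerpt above can be made explicit. Since $\tau(t) \in [0, \tau]$, the delay-dependent term is an affine (hence convex) combination of its two endpoint values, so checking the resulting LMI at the two vertices suffices. This is a standard observation, sketched here rather than quoted from the paper:

```latex
% For \tau(t) \in [0,\tau], set \lambda = \tau(t)/\tau \in [0,1]. Then
\tau(t)\, N Z^{-1} N^{T} + \bigl(\tau - \tau(t)\bigr) M Z^{-1} M^{T}
  = \lambda \bigl(\tau\, N Z^{-1} N^{T}\bigr)
  + (1 - \lambda) \bigl(\tau\, M Z^{-1} M^{T}\bigr),
% so an LMI that is affine in this term holds for all \tau(t) \in [0,\tau]
% if and only if it holds at the vertices \tau(t) = 0 and \tau(t) = \tau.
```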
“…Since these criteria are generally more conservative than delay-dependent ones, especially when the size of the delay is small, delay-dependent stability has attracted considerable interest. For RNNs with a constant delay, delay-dependent results were obtained in [16,22,23], while delay-dependent criteria were reported for the case of time-varying delays in, e.g., [6,8,10,24]. For RNNs with distributed delays, delay-dependent stability was addressed in [5,13,15].…”
mentioning
confidence: 99%
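The excerpt above contrasts delay-independent and delay-dependent criteria: the size of the delay determines whether stability is preserved. A minimal numerical sketch of this (not taken from any of the cited papers; the matrices C, A, B, the delay tau, and the initial history are made-up illustrative values) simulates a delayed Hopfield-type network $\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau))$ with forward-Euler integration:

```python
# Illustrative sketch: simulating a delayed neural network
#   x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau))
# to show the trajectory decaying for a small delay. All parameter
# values below are hypothetical, chosen so the system is stable.
import numpy as np

def simulate(tau, T=30.0, dt=0.01):
    C = np.diag([2.0, 2.0])                  # positive self-feedback (decay) rates
    A = np.array([[0.1, -0.2], [0.2, 0.1]])  # instantaneous connection weights
    B = np.array([[0.3, 0.2], [-0.2, 0.3]])  # delayed connection weights
    f = np.tanh                              # bounded, Lipschitz activation
    steps = int(T / dt)
    d = max(int(tau / dt), 1)                # delay measured in time steps
    x = np.zeros((steps + 1, 2))
    x[0] = [1.0, -0.5]                       # constant initial history
    for k in range(steps):
        xd = x[max(k - d, 0)]                # delayed state x(t - tau)
        x[k + 1] = x[k] + dt * (-C @ x[k] + A @ f(x[k]) + B @ f(xd))
    return x

x = simulate(tau=0.5)
print(np.linalg.norm(x[-1]))  # small: the trajectory converges to the origin
```

Delay-dependent criteria of the kind discussed above certify such convergence for all delays up to a computed bound, rather than for one simulated value.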
“…The upper bounds of s for various l obtained from Theorem 2 and those in [17] and [18] are listed in Table 2. It is clear that the upper bounds of s obtained in this brief, which guarantee the asymptotic stability of the neural networks, are better than those in [17] and [18].…”
Section: Numerical Examples (mentioning)
confidence: 99%
“…Therefore, the stability of delayed neural networks has attracted considerable interest in recent years, since the dynamic behavior of neural networks often involves time delays, which may cause instability of the systems; see, e.g., [1][2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18] and the references therein. In this literature, stability results can be classified into two types: delay-independent stability criteria and delay-dependent stability criteria.…”
Section: Introduction (mentioning)
confidence: 99%
“…Choose the Lyapunov–Krasovskii functional $V(x(t),t) = V_1(x(t),t) + V_2(x(t),t) + V_4(x(t),t)$, where $V_1(x(t),t)$, $V_2(x(t),t)$, and $V_4(x(t),t)$ are defined in (13). By Assumption 2.2, it is well known that there exist diagonal matrices $D_1 > 0$, $D_2 > 0$ such that the following inequalities hold…”
Section: By Lemma 2.4 We Have (mentioning)
confidence: 99%
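As background for the excerpt above, augmented Lyapunov–Krasovskii functionals of this kind are typically assembled from a quadratic term, a single-integral term over the delay interval, and a double-integral term penalizing the state derivative. An illustrative template (not the exact functional of the cited paper, whose components are defined in its equation (13)) is:

```latex
V(x_t) = \underbrace{x(t)^{T} P\, x(t)}_{V_1}
 + \underbrace{\int_{t-\tau(t)}^{t} x(\alpha)^{T} Q\, x(\alpha)\, d\alpha}_{V_2}
 + \underbrace{\int_{-\tau}^{0} \int_{t+\beta}^{t}
     \dot{x}(\alpha)^{T} Z\, \dot{x}(\alpha)\, d\alpha\, d\beta}_{V_4},
\qquad P,\, Q,\, Z \succ 0 .
% V_4 is the double-integral term whose derivative produces the two
% integrals of \dot{x}^T Z \dot{x} handled by free-weighting matrices.
```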