2008
DOI: 10.1007/s11633-008-0290-x
Dissipativity analysis of neural networks with time-varying delays

Abstract: A new definition of dissipativity for neural networks is presented in this paper. By constructing proper Lyapunov functionals and using some analytic techniques, sufficient conditions are given, in terms of linear matrix inequalities, to ensure the dissipativity of neural networks with or without time-varying parametric uncertainties, as well as of integro-differential neural networks. Numerical examples illustrate the effectiveness of the obtained results.
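The LMI conditions mentioned in the abstract build on Lyapunov theory. As a minimal, hypothetical sketch (the matrix A below is illustrative, not taken from the paper), the continuous-time Lyapunov equation AᵀP + PA = −Q can be solved numerically; for a Hurwitz A the solution P is positive definite, which is the basic certificate underlying such LMI-based tests:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable linearization of a delayed neural network at an
# equilibrium point (A is an assumption for illustration, not from the paper).
A = np.array([[-2.0, 0.5],
              [0.3, -3.0]])
Q = np.eye(2)  # any positive definite weight matrix

# Solve A^T P + P A = -Q. For a Hurwitz A this yields P > 0, so
# V(x) = x^T P x is a valid quadratic Lyapunov function candidate.
P = solve_continuous_lyapunov(A.T, -Q)

eigs = np.linalg.eigvalsh(P)
print(eigs.min() > 0)  # True: P is positive definite
```

The delay-dependent conditions in the paper extend this idea to Lyapunov-Krasovskii functionals with integral terms, but the feasibility check still reduces to matrix inequalities of this kind.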

Cited by 19 publications (17 citation statements)
References 19 publications
“…In the cited works, either Jensen's inequality or the free-weighting-matrices approach is introduced to deal with integral terms. For instance, the authors of one cited work considered the dissipativity and passivity problem of neural networks by constructing an LKF with double integral terms, which is reduced under Jensen's inequality. But in our paper, the LKF is constructed by using multiple integrals up to quadruple, such as $\int_{t-\tau}^{t}\big[\eta_1^{T}(t,s)U\eta_1(t,s)+(\tau-t+s)\eta_2^{T}(s)V\eta_2(s)+(\tau-t+s)^{2}\dot{x}^{T}(s)W_1\dot{x}(s)+(\tau-t+s)^{3}\dot{x}^{T}(s)W_2\dot{x}(s)\big]\,\mathrm{d}s$, along with the integral term $\int_{t-\tau(t)}^{t}\eta_1^{T}(t,s)T\eta_1(t,s)\,\mathrm{d}s$.…”
Section: Results (mentioning; confidence: 99%)
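For context, the Jensen integral inequality referred to in the statement above is commonly given in the following standard form (stated here for reference, not quoted from the cited works): for $R = R^{T} > 0$ and delay $\tau > 0$,

```latex
-\int_{t-\tau}^{t}\dot{x}^{T}(s)\,R\,\dot{x}(s)\,\mathrm{d}s
\;\le\;
-\frac{1}{\tau}\Big(\int_{t-\tau}^{t}\dot{x}(s)\,\mathrm{d}s\Big)^{T}
R\,
\Big(\int_{t-\tau}^{t}\dot{x}(s)\,\mathrm{d}s\Big).
```

It bounds the integral term in the LKF derivative by a single quadratic form in $x(t)-x(t-\tau)$, which is what makes the resulting condition expressible as an LMI.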
“…There are some remarkable features in Lemma 2.1, which can be highlighted by comparison with some existing results [17,18,26,27]. In [17,18,26,27], either Jensen's inequality or the free-weighting-matrices approach is introduced to deal with integral terms.…”
Section: Remark 3.6 (mentioning; confidence: 99%)
“…Definition 2.1 [34] Given a scalar γ > 0, real symmetric matrices Q, R, and a matrix S, the T-S fuzzy neural networks in (5) are called strictly (Q, R, S)-γ-dissipative if, for any t_p ≥ 0 and under the zero initial condition, the following condition is satisfied:…”
Section: Lemma 2.3 [26] (mentioning; confidence: 99%)
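The quoted excerpt truncates before the inequality itself. In the usual strict (Q, S, R)-γ-dissipativity formulation (a standard form stated here as an assumption, not a verbatim quote from [34]), the condition reads:

```latex
\int_{0}^{t_p}\big(y^{T}(t)\,Q\,y(t)+2\,y^{T}(t)\,S\,u(t)+u^{T}(t)\,R\,u(t)\big)\,\mathrm{d}t
\;\ge\;
\gamma\int_{0}^{t_p} u^{T}(t)\,u(t)\,\mathrm{d}t,
```

where $u$ denotes the input and $y$ the output of the network.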
“…Considering the fact that time delays in the signal transmission between neurons can cause oscillatory or even unstable phenomena [4-6,20,22,23,29,30], the stability analysis problem for delayed BAM neural networks has received much research interest. Sufficient conditions, either delay-dependent or delay-independent, have been proposed to guarantee the asymptotic or exponential stability of BAM neural networks; see, e.g.,…”
(mentioning; confidence: 99%)