2005
DOI: 10.1007/s11063-004-3427-0

Complete Convergence of Competitive Neural Networks with Different Time Scales

Abstract: This paper studies the complete convergence of a class of neural networks with different time scales under the assumption that the activation functions are unsaturated piecewise linear functions. Under this assumption the network admits multiple equilibrium points, so traditional convergence methods, which presuppose a unique equilibrium, cannot be applied. Complete convergence is instead proved by constructing an energy-like function. Simulations are employed to illustrate the theory.
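The abstract does not reproduce the model equations. Purely as an illustration of the setting described there — a fast/slow (two-time-scale) network with an unsaturated piecewise linear activation f(s) = max(0, s) and coexisting equilibria — the following is a minimal simulation sketch of a generic two-neuron example; the weights W and M, the input h, and the time-scale parameter eps are hypothetical and are not taken from the paper.

```python
import numpy as np

def f(s):
    """Unsaturated piecewise linear activation (the class assumed in the paper)."""
    return np.maximum(0.0, s)

def simulate(x0, y0, W, M, h, eps=0.05, dt=1e-3, steps=50_000):
    """Forward-Euler run of a generic fast/slow (two-time-scale) network.

    x is the fast state, y the slow state; eps < 1 separates the time scales.
    All weights and inputs used below are illustrative, not from the paper.
    """
    x, y = np.asarray(x0, float).copy(), np.asarray(y0, float).copy()
    for _ in range(steps):
        dx = (-x + W @ f(x) + y + h) / eps   # fast dynamics
        dy = -y + M @ f(x)                    # slow dynamics
        x, y = x + dt * dx, y + dt * dy
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = np.array([[0.0, -1.5], [-1.5, 0.0]])  # hypothetical mutual inhibition
    M = 0.1 * np.eye(2)                        # hypothetical slow coupling
    h = np.array([1.0, 1.0])                   # hypothetical constant input
    # With this inhibition pattern two stable equilibria coexist; runs from
    # different initial states settle at one or the other, which is the kind
    # of behavior complete convergence describes for multistable networks.
    for _ in range(3):
        x0, y0 = rng.uniform(-1, 1, 2), rng.uniform(-1, 1, 2)
        print(simulate(x0, y0, W, M, h))
```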

Cited by 9 publications (6 citation statements) · References 13 publications
“…If the weights of dynamic neural networks (2) are bounded, A, B, C and D in (10) are chosen such that B is a non-singular Hurwitz matrix and A − CB⁻¹D is Hurwitz, Q_P and Q_S are selected such that P and S are the solutions of (18), and γ > 0 is chosen such that inequality (22) is satisfied, then system (2) is asymptotically stable for 0 ≤ ε ≤ ε*, where ε* = min(ε₁*, ε₂*), and ε₁*, ε₂* are given in (23).…”
Section: Theorem
confidence: 99%
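The quoted conditions refer to equations (10), (18), (22) and (23) of the citing paper, which are not reproduced here. As a generic, hypothetical sketch of how such conditions are typically verified numerically — checking that B is non-singular and Hurwitz, that A − CB⁻¹D is Hurwitz, and then solving continuous Lyapunov equations for P and S — the matrices A, B, C, D and the choices of Q_P, Q_S below are made up for illustration:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def is_hurwitz(m):
    """A real matrix is Hurwitz if every eigenvalue has negative real part."""
    return np.all(np.linalg.eigvals(m).real < 0)

# Hypothetical block matrices standing in for A, B, C, D of the citing paper.
A = np.array([[-2.0, 0.5], [0.0, -3.0]])
B = np.array([[-1.0, 0.2], [0.0, -1.5]])
C = 0.1 * np.eye(2)
D = 0.2 * np.eye(2)

assert np.linalg.det(B) != 0 and is_hurwitz(B), "B must be non-singular and Hurwitz"
A_red = A - C @ np.linalg.inv(B) @ D          # reduced matrix A - C B^{-1} D
assert is_hurwitz(A_red), "A - C B^{-1} D must be Hurwitz"

# Lyapunov equations of the usual form X^T P + P X = -Q; the specific
# equations (18) of the citing paper are not reproduced here.
Q_P, Q_S = np.eye(2), np.eye(2)
P = solve_continuous_lyapunov(A_red.T, -Q_P)
S = solve_continuous_lyapunov(B.T, -Q_S)
print("P =\n", P, "\nS =\n", S)
```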
“…The complete convergence of neural networks with different time scales is proved in Ref. 22. The global exponential stability of delayed competitive neural networks with different time scales is discussed in Ref.…”
Section: Introduction
confidence: 99%
“…It is worth noting that there exist interesting topics in multistability research for multitime-scale competitive neural networks: the existing results incorporated a global Lipschitz condition into the convergence criteria and attained monostability results. Some new multistable dynamics could not be directly revealed by using the existing approaches. Sigmoidal or piecewise linear activation properties are employed in and for multistability analysis of multitime-scale competitive networks. So, inevitably, some limitations on application to a general class of competitive networks exist. Delay-coupled neurons can exhibit complex multistable activation dynamics.…”
Section: Introduction
confidence: 99%
“…The complete convergence of neural networks with different time scales is proved in Ref. [26]. The global exponential stability of delayed competitive neural networks with different time scales is discussed in Ref.…”
Section: Introduction
confidence: 99%