2010
DOI: 10.1007/s11432-010-4148-9

Self-excitation of neurons leads to multiperiodicity of discrete-time neural networks with distributed delays

Abstract: In this paper, we investigate the multiperiodicity of discrete-time neural networks with excitatory self-connections and distributed delays. Owing to the self-excitation of neurons, we construct 2^N closed regions in the state space of an N-dimensional network and establish the coexistence of 2^N periodic sequence solutions in these closed regions. We also estimate an exponentially attracting domain for each periodic sequence solution and apply our results to discrete-time analogues of periodic or autonomous neural networks …
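The abstract is truncated before the model is stated, so the sketch below only illustrates the kind of system the paper studies: a discrete-time, N-dimensional network with excitatory self-connections, a finite distributed-delay kernel, and a small periodic input. The model form, parameter values, and the function names are assumptions for illustration, not the paper's system. With strong enough self-excitation, initial states with different sign patterns stay in different invariant regions, which is the coexistence phenomenon the abstract describes.

```python
# Minimal sketch (assumed model form): a discrete-time Hopfield-type network with
# excitatory self-connections and a finite distributed-delay kernel.
import numpy as np

N = 2                       # network dimension
K = 5                       # distributed-delay horizon
a = 0.2 * np.ones(N)        # decay rates
B = np.array([[3.0, -0.3],  # instantaneous weights; positive diagonal = self-excitation
              [-0.3, 3.0]])
C = 0.2 * np.eye(N)         # distributed-delay weights
mu = np.ones(K) / K         # normalized delay kernel, sum_k mu[k] = 1
f = np.tanh                 # bounded, saturating activation

def step(history, n, period=20):
    """One update of x(n+1) from the stored states (history[-1] is x(n))."""
    x_now = history[-1]
    delayed = sum(mu[k] * f(history[-1 - (k + 1)]) for k in range(K))
    I = 0.1 * np.sin(2 * np.pi * n / period)      # small periodic input (assumed)
    return a * x_now + B @ f(x_now) + C @ delayed + I

def simulate(x0, steps=400):
    """Iterate the network from a constant initial history equal to x0."""
    hist = [np.array(x0, dtype=float)] * (K + 1)
    for n in range(steps):
        hist.append(step(hist, n))
    return np.array(hist)

# Different sign patterns of the initial state land in different invariant regions:
for x0 in ([1, 1], [1, -1], [-1, 1], [-1, -1]):
    tail = simulate(x0)[-40:]
    print(x0, "->", np.sign(tail.mean(axis=0)))
```

Running the sketch prints one persistent sign pattern per initial condition, i.e. 2^2 = 4 coexisting periodically forced regimes for N = 2, mirroring the 2^N coexistence stated in the abstract.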

Cited by 5 publications (4 citation statements)
References 22 publications

“…Time delays are often introduced in neural networks to exhibit chaotic phenomena that can be applied to secure communication [10]. Hence, neural networks with time delays have attracted wide interest, and a large number of results have been reported in recent years, for example [11][12][13][14].…”
Section: Introduction (mentioning, confidence: 99%)
“…Many neural networks are used in the control problem of nonlinear systems [21][22][23][24][25][26][27][28]. For neural networks, the two major categories are feedforward network (FNN) and recurrent network (RNN).…”
Section: Introduction (mentioning, confidence: 99%)
“…Neural networks are frequently used to construct the sequence memory model. The conventional associative memory model evolves to stable steady state [10][11][12][13][14][15], while neural networks based sequence memory model switches orderly from one pattern to another. This function requires an ability to get out of stable state in a neural network [16].…”
Section: Introduction (mentioning, confidence: 99%)