2004
DOI: 10.1016/s0960-0779(04)00236-x

Global robust stability of delayed recurrent neural networks

Cited by 32 publications (64 citation statements) | References 0 publications

“…However, as discussed in [3,4], in many electronic circuits the input-output functions of amplifiers may be neither monotonically increasing nor continuously differentiable; hence nonmonotonic functions can be more appropriate for describing the neuron activations when designing and implementing an artificial neural network. In this paper, we make the following assumption for the neuron activation functions, under which the activation functions no longer need to be differentiable, monotonically increasing, or bounded.…”
Section: Problem Formulation (mentioning; confidence: 99%)
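
A common way to formalize such a relaxed condition is a generalized Lipschitz-type assumption; the following is a representative sketch (the constants $l_i^-$, $l_i^+$ are illustrative names, not necessarily the notation of the cited paper):

\[
  l_i^- \le \frac{f_i(u) - f_i(v)}{u - v} \le l_i^+ , \qquad \forall\, u, v \in \mathbb{R},\ u \ne v,
\]

where $l_i^-$ and $l_i^+$ may be of any sign, so each activation $f_i$ need not be monotone, differentiable, or bounded.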
“…[1,12,18]). In recent years, a great number of papers have been published on various neural networks with time delays, and the existence of equilibrium points, global asymptotic stability, global exponential stability, and the existence of periodic solutions have been intensively investigated; see [3,4,10,15-17,19-24] for some recent results.…”
Section: Introduction (mentioning; confidence: 99%)
“…Now, with the parameter ranges given by Equation (7), we have B 2 B 2 + B 2 [4]. Therefore, Equation (46) …”
Section: L N) Owing to Equation (37) the Second Term in Eq. (mentioning; confidence: 99%)
“…Note that various classes of neural networks, such as Hopfield neural networks [17,18], recurrent neural networks [19,20], cellular neural networks [21], Cohen-Grossberg neural networks [22], and bidirectional associative memory (BAM) neural networks [23-25], have been widely used in solving signal processing, optimization, and image processing problems. In the last few years, some researchers have introduced fractional operators into neural networks to form fractional-order neural models [26-30], which can better describe the dynamical behaviors of the neurons.…”
Section: Introduction (mentioning; confidence: 99%)
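
For context, fractional-order neural models of the kind cited in [26-30] typically replace the integer-order derivative with a Caputo derivative of order $\alpha \in (0, 1)$; a representative delayed Hopfield-type form (a sketch under these assumptions, not a quotation from those references) is

\[
  {}^{C}\!D^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j\big(x_j(t)\big) + \sum_{j=1}^{n} b_{ij} f_j\big(x_j(t-\tau_j)\big) + I_i , \qquad i = 1, \dots, n,
\]

where the memory inherent in the Caputo operator is what allows such models to capture the hereditary dynamics of neurons more faithfully than integer-order equations.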