2014
DOI: 10.1088/0951-7715/27/2/289
Global dynamics of discrete neural networks allowing non-monotonic activation functions

Abstract: We show that in discrete models of Hopfield type some properties of global stability and chaotic behaviour are coded in the dynamics of a related one-dimensional equation. Using this fact, we obtain some new results on stability and chaos for a system of delayed neural networks. Notably, we do not require monotonicity of the activation function, we allow any architecture of the network, and our conclusions are independent of the size of the time delays.
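The abstract does not reproduce the model, but a common discrete-time Hopfield-type system with delays has the form x_i(n+1) = a_i x_i(n) + Σ_j b_ij f(x_j(n − τ_ij)). The Python sketch below simulates such a system with an arbitrary architecture, heterogeneous delays, and a non-monotonic activation; the model form, the activation f(u) = u·e^(−u²), and all parameter values are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

def simulate(a, B, tau, f, history, steps):
    """Iterate x_i(n+1) = a_i*x_i(n) + sum_j B_ij*f(x_j(n - tau_ij)).
    history: array of shape (tau.max()+1, N) giving the initial segment."""
    N = len(a)
    x = [np.asarray(row, dtype=float) for row in history]  # x[-1] = current state
    for _ in range(steps):
        x_new = np.empty(N)
        for i in range(N):
            delayed = sum(B[i, j] * f(x[-1 - tau[i, j]][j]) for j in range(N))
            x_new[i] = a[i] * x[-1][i] + delayed
        x.append(x_new)
    return np.array(x)

# Non-monotonic activation (illustrative choice, not from the paper).
f = lambda u: u * np.exp(-u * u)

a = np.array([0.5, 0.5])                     # decay rates (assumed)
B = np.array([[0.0, 1.2], [-1.2, 0.0]])      # any architecture is allowed
tau = np.array([[0, 3], [2, 0]])             # delays may differ per connection
history = np.full((tau.max() + 1, 2), 0.1)   # constant initial history
trajectory = simulate(a, B, tau, f, history, steps=200)
```

The point the abstract makes is that the long-term behaviour of such a coupled system can be read off from a related one-dimensional map, regardless of B's structure and the sizes of the entries of tau.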

Cited by 5 publications (3 citation statements)
References 49 publications
“…In a recent paper [27], Zhuge, Sun, and Lei showed that the ω-limit set for any positive solution of (4.19) is contained in (4.17), which is exactly the set defined by the relation we derived in (4.16).…”
Section: μ(S)ds N(t A) (mentioning)
confidence: 85%
“…Equation (1.2) can describe the dynamics of a single neuron (or the mean-field equation for a population of neurons in a network) with an excitatory and an inhibitory loop having different delays. Such a pair of positive and negative delayed feedbacks occurs in the processing of sensory information and in other processes in neuroscience (see [5,12,17,19] and references therein).…”
Section: Introduction (mentioning)
confidence: 99%
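The cited equation (1.2) is not reproduced on this page, but a standard discrete-time form of a single neuron with one excitatory and one inhibitory delayed loop is x(n+1) = λx(n) + αf(x(n − τ_e)) − βf(x(n − τ_i)). The sketch below iterates this assumed form; all gains, delays, and the choice f = tanh are hypothetical.

```python
import numpy as np

# Hypothetical single-neuron model with an excitatory and an inhibitory
# delayed feedback loop, in the spirit of the equation the citation refers to.
# All parameter values are illustrative assumptions.

lam, alpha, beta = 0.9, 1.5, 1.5   # decay and feedback gains (assumed)
tau_e, tau_i = 2, 5                # different excitatory/inhibitory delays
f = np.tanh                        # a standard bounded activation (assumed)

hist = [0.1] * (max(tau_e, tau_i) + 1)   # constant initial history
for _ in range(500):
    x = hist[-1]
    hist.append(lam * x
                + alpha * f(hist[-1 - tau_e])    # excitatory loop
                - beta * f(hist[-1 - tau_i]))    # inhibitory loop
```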
“…The ELU is monotonic and approaches a constant value for negative inputs. Theoretically, a monotonic activation function should give better performance [19][20][21]; ZHENG et al. [22] proposed an improved activation function based on ELU, named FELU, which is also monotonic. In contrast, Swish and Mish are not monotonic, and both approach zero for increasingly negative inputs; some studies tend to use non-monotonic activation functions [23][24][25][26][27][28]. The newly proposed activation functions RMAF [29] and REU (PREU) [30] are similar to Swish but achieve better accuracy in their experiments, and are also non-monotonic.…”
Section: Introduction (mentioning)
confidence: 99%
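For reference, the activations contrasted in this statement have short standard definitions; the sketch below (FELU omitted, since its formula is not given here) checks monotonicity numerically on the negative axis, where the difference shows up.

```python
import numpy as np

# Standard definitions of the activations contrasted above. ELU is monotonic;
# Swish and Mish dip below zero for negative inputs and tend back to zero as
# x -> -inf, so they are non-monotonic.

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x):
    return x / (1.0 + np.exp(-x))            # x * sigmoid(x)

def mish(x):
    return x * np.tanh(np.log1p(np.exp(x)))  # x * tanh(softplus(x))

x = np.linspace(-10, 0, 1001)
print(np.all(np.diff(elu(x)) >= 0))    # True: ELU is monotonic here
print(np.all(np.diff(swish(x)) >= 0))  # False: Swish is non-monotonic
print(np.all(np.diff(mish(x)) >= 0))   # False: Mish is non-monotonic
```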