2018
DOI: 10.18514/mmn.2018.1175
Neural networks with distributed delays and Hölder continuous activation functions

Abstract: We consider a system arising in neural network theory with distributed delays and Hölder continuous activation functions. We prove some results on the global exponential stability of the system. This extends previous works in which the activation functions were assumed to be Lipschitz continuous.
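The abstract does not state the system explicitly. As an illustration only, a typical Hopfield-type model with distributed delays of the kind treated in this literature reads (all symbols below are our assumptions, not notation taken from the paper)

\[
x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t)\bigr)
+ \sum_{j=1}^{n} c_{ij} \int_{0}^{\infty} k_{ij}(s)\, g_j\bigl(x_j(t-s)\bigr)\, ds + I_i,
\qquad i = 1, \dots, n,
\]

where $a_i > 0$ are decay rates, $b_{ij}$ and $c_{ij}$ are connection weights, $k_{ij}$ are delay kernels, $I_i$ are external inputs, and the activation functions $f_j$, $g_j$ are required to be Hölder continuous rather than Lipschitz continuous, i.e. $|f_j(u) - f_j(v)| \le L_j |u - v|^{\alpha_j}$ with $0 < \alpha_j \le 1$.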

Cited by 3 publications (4 citation statements). References 8 publications.
“…Observe that the Hölder continuous functions (having an exponent between 0 and 1) cannot be covered by our results. These were studied in [7, 27–33]. It is known that when the exponent exceeds one, the functions are constant.…”
Section: Numerical Illustration (mentioning, confidence: 99%)
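For context, the remark about exponents is a standard calculus fact (a brief sketch of ours, not quoted from the cited works): if $|f(u) - f(v)| \le L\,|u - v|^{\alpha}$ for all $u, v$ with $\alpha > 1$, then

\[
\left| \frac{f(u) - f(v)}{u - v} \right| \le L\,|u - v|^{\alpha - 1} \to 0 \quad \text{as } v \to u,
\]

so $f'(u) = 0$ everywhere and $f$ must be constant; this is why only exponents $0 < \alpha \le 1$ are of interest.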
“…In view of the importance of non-Lipschitz activation functions in implementations [15], a relaxation of the Lipschitz condition is necessary. This has motivated some researchers to consider discontinuous functions and Hölder-type functions; one may refer to [2, 3, 7, 11, 27–33, 35].…”
Section: Introduction (mentioning, confidence: 99%)
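A simple illustration of such a non-Lipschitz activation (our example, not one drawn from the cited references) is $f(u) = \sqrt{|u|}$: it satisfies $|f(u) - f(v)| \le |u - v|^{1/2}$, hence is Hölder continuous with exponent $1/2$, but it is not Lipschitz continuous, since $|f(u) - f(0)|/|u - 0| = |u|^{-1/2} \to \infty$ as $u \to 0$.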
“…Such cases arise naturally in applications. Some efforts have been made in this respect, such as in Tatar and Wu et al. [14–17] and in Forti et al. and Tatar [8, 14, 18–22], through the treatment of Hölder continuous activation functions. One of the difficulties encountered is the loss of global stability caused by the use of some integral inequalities.…”
Section: Introduction (mentioning, confidence: 99%)
“…Namely, in [28], the activation functions of a neural network system without delays are assumed to be merely bounded by non-decreasing functions. Hölder continuity has been assumed in [29] for problems without delays, in [30] for discrete delays, and in [31] for distributed delays. The Hölder continuity of f_j and g_j is assumed in [32] for the problem…”
Section: Introduction (mentioning, confidence: 99%)