2022
DOI: 10.1016/j.matcom.2021.10.012
Global exponential synchronization of high-order quaternion Hopfield neural networks with unbounded distributed delays and time-varying discrete delays

Cited by 49 publications (13 citation statements) · References 36 publications
Citation types: 0 supporting, 13 mentioning, 0 contrasting
“…The purpose of this paper is to design a state observer (3) for DTBAMNN (1) via the measurements (2), that is, to determine observer gains $R := [r_{ij}] \in \mathbb{R}^{n \times m_1}$ and $\bar{R} := [\bar{r}_{ij}] \in \mathbb{R}^{n \times m_2}$ guaranteeing GES of the error system (4). Lemma 1: [37] For a Metzler matrix $A_0 \in \mathbb{R}^{n \times n}$ and matrices $B_0, C_0, D_0 \in \mathbb{R}^{n \times n}$, the items (a)-(c) are equivalent:…”
Section: Problem Description and Preliminary Results (mentioning, confidence: 99%)
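Context for the quoted Lemma 1 (our gloss, not from the cited paper): a Metzler matrix is one whose off-diagonal entries are all nonnegative, and for such matrices Hurwitz stability is equivalent to the existence of a componentwise-positive vector $v$ with $A_0 v < 0$. A minimal NumPy sketch of the two basic checks; the helper names is_metzler and is_hurwitz are ours:

```python
import numpy as np

def is_metzler(A, tol=0.0):
    """A matrix is Metzler if every off-diagonal entry is nonnegative."""
    off_diag = A - np.diag(np.diag(A))
    return bool(np.all(off_diag >= -tol))

def is_hurwitz(A):
    """A matrix is Hurwitz if every eigenvalue has negative real part."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Example: negative diagonal, nonnegative off-diagonal, diagonally dominant.
A0 = np.array([[-3.0, 1.0],
               [0.5, -2.0]])
print(is_metzler(A0), is_hurwitz(A0))  # True True
```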
“…The method directly uses the generalized matrix inverses and the definitions of GES, and it avoids the construction of any LKF; (2) The obtained sufficient conditions are composed of linear scalar inequalities that are easy to solve; (3) It is suitable for more general neural network models after a small modification, for example, memristor-based NNs [40], inertial neural networks [41] and high-order NNs [4], [42], [43].…”
Section: Discussion (mentioning, confidence: 99%)
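Since the quoted sufficient conditions reduce to linear scalar inequalities, their feasibility can be tested with any off-the-shelf LP solver. A hedged sketch using SciPy; the system A_ub x ≤ b_ub below is an illustrative placeholder, not the paper's actual conditions:

```python
import numpy as np
from scipy.optimize import linprog

# Stand-in inequality system A_ub @ x <= b_ub; in practice the rows would
# encode the scalar sufficient conditions for GES from the theorems.
A_ub = np.array([[ 1.0,  2.0],
                 [-1.0,  1.0],
                 [ 0.0, -1.0]])
b_ub = np.array([4.0, 1.0, 0.0])

# Zero objective: we only ask whether some x satisfies all inequalities.
res = linprog(c=np.zeros(2), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * 2)
print("feasible:", res.success, "witness:", res.x)
```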
“…In studies of the global stability of neural network models, both discrete and continuous, it is usually assumed that the activation functions $f_j$ are Lipschitz [2,6,7,10,12,25,37]. Here we do not assume that the $f_j$ are Lipschitz, and hypothesis (H1) only implies the continuity of $f_j$ at $u = 0$.…”
Section: Remark (mentioning, confidence: 99%)
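To see what dropping the Lipschitz assumption buys, a standard example (ours, not from the quoted paper) is $f(u) = \mathrm{sign}(u)\sqrt{|u|}$: it is continuous at $u = 0$, so a hypothesis like (H1) can hold, yet its difference quotient at the origin is unbounded, so no Lipschitz constant exists:

```python
import numpy as np

def f(u):
    """Continuous at u = 0 but not Lipschitz there: sign(u) * sqrt(|u|)."""
    return np.sign(u) * np.sqrt(np.abs(u))

# |f(u) - f(0)| / |u| = 1 / sqrt(u) for u > 0: the quotient grows without
# bound as u -> 0, ruling out any Lipschitz constant near the origin.
for u in [1e-2, 1e-4, 1e-6]:
    print(u, abs(f(u) - f(0)) / u)
```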