2018
DOI: 10.1016/j.neucom.2017.11.011
Global exponential stability of memristive Cohen–Grossberg neural networks with mixed delays and impulse time window

Cited by 33 publications (18 citation statements). References 43 publications.
“…Over the last few years, Cohen–Grossberg neural networks have received increasing attention because of their potential applications in fields such as signal processing, image processing, pattern recognition, associative memory, programming problems, and combinatorial optimization (see [1-5]). The Cohen–Grossberg neural network model was first introduced by Cohen and Grossberg in 1983 and has since become one of the most important neural network models (see [6,7]). Compared with recurrent, Hopfield, and cellular neural networks, the Cohen–Grossberg model is more challenging and interesting to study.…”
Section: Introduction (mentioning)
confidence: 99%
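
For reference, the Cohen–Grossberg model referred to in this statement is usually written in the following classical form; this is a standard formulation from the stability literature (sign conventions for the external input vary slightly across papers), not the specific memristive, delayed, impulsive variant studied in the cited paper:

\[
\dot{x}_i(t) = -a_i\big(x_i(t)\big)\Big[\, b_i\big(x_i(t)\big) - \sum_{j=1}^{n} c_{ij}\, f_j\big(x_j(t)\big) + I_i \Big], \qquad i = 1, \dots, n,
\]

where \(a_i(\cdot)\) is the amplification function, \(b_i(\cdot)\) the self-inhibition (behaved) function, \(c_{ij}\) the connection weights, \(f_j(\cdot)\) the neuron activation functions, and \(I_i\) the external input. Hopfield and cellular neural networks arise as special cases, e.g. with \(a_i \equiv 1\) and \(b_i(x_i)\) linear.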
“…Based on Assumptions 2, 3, 4, 5, 6, and 13, the trivial solution of system (24) is mean-square exponentially input-to-state stable if there exist positive constants …”
(mentioning)
confidence: 99%
“…In particular, studies of memristive CGNNs (MCGNNs) concerning stability [27], [29], [30] and synchronization [24], [25], [31], [32] have attracted growing attention. Among these topics, the synchronization of a class of MCGNNs is worth studying for applications in secure communications and image and data encryption.…”
Section: Introduction (mentioning)
confidence: 99%
“…Many scholars have devoted themselves to the study of the dynamical behavior of MNNs, such as synchronization [5], stability and stabilization [6], [7], passivity [8]-[12], dissipativity [13], [14], etc. Up to now, many kinds of memristor-based neural network models have been proposed, such as reaction-diffusion memristor-based neural networks [5], [8], [12], [15], inertial memristor-based neural networks [9], [10], Hopfield memristor-based neural networks [7], Cohen–Grossberg memristor-based neural networks [16]-[18], and fractional-order memristor-based neural networks [19]-[23]. Synchronization and stabilization are considered to be the most important dynamical behaviors of MNNs.…”
Section: Introduction (mentioning)
confidence: 99%