2019
DOI: 10.1109/access.2019.2935776

Multistability of Fractional-Order Recurrent Neural Networks With Discontinuous and Nonmonotonic Activation Functions

Abstract: The coexistence of multiple stable equilibria in recurrent neural networks is an important dynamic characteristic for associative memory and other applications. In this paper, the existence and local Mittag-Leffler stability of multiple equilibria are investigated for a class of fractional-order recurrent neural networks with discontinuous and nonmonotonic activation functions. By using Brouwer's fixed point theorem, several conditions are established to ensure the existence of 5^n equilibria, in which all the …
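For readers unfamiliar with the fractional-order setting, the standard definitions behind the abstract's terminology can be sketched as follows (these are the usual textbook forms, not taken from the paper itself; the constants m(x_0), λ, b in the stability bound are generic):

```latex
% Caputo fractional derivative of order 0 < \alpha < 1:
{}^{C}\!D^{\alpha} x(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\, ds .

% One-parameter Mittag-Leffler function, which plays the role of
% e^{\lambda t} in fractional-order stability estimates:
E_{\alpha}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)} .

% Local Mittag-Leffler stability of an equilibrium x^{*} asserts,
% roughly, a decay bound of the form
% \| x(t) - x^{*} \| \le \bigl\{ m(x_{0})\, E_{\alpha}(-\lambda t^{\alpha}) \bigr\}^{b},
% with m(x_{0}) \ge 0, \lambda > 0, b > 0.
```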

Cited by 13 publications (5 citation statements); references 35 publications.
“…They take an input signal in a neural network and transform it into a non-linear output [29]. The simplest ones are linear, but because they do not apply radical transformations, they cannot be successfully used for complex problems [30]. The more commonly used activation functions in medical image segmentation are non-linear because they can work with more complicated structures in the data.…”
Section: Activation Functions
confidence: 99%
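The statement above contrasts linear activations with the non-linear ones used in practice; a minimal sketch in Python of two common non-linear activations (NumPy; the function names and sample inputs are mine, not from the cited papers):

```python
import numpy as np

def relu(x):
    # Non-linear: negative inputs are clipped to zero, so the output
    # is not proportional to the input across its whole range.
    return np.maximum(0.0, x)

def tanh(x):
    # Non-linear and saturating: outputs lie strictly between -1 and 1.
    return np.tanh(x)

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(relu(x))
print(tanh(x))
```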
“…Once the activation function is applied to each neuron, the model can assess whether that neuron contributes to accurate predictions [29]. If activation functions were removed, the neuronal layers would not be regularized, and the weights between the nodes would not hold appropriate values [30]. Without activation functions, the algorithms risk failing to converge or incurring longer training times.…”
Section: Activation Functions
confidence: 99%
“…Lorenz discovered the first chaotic attractor in the 1960s [2]. Chaos, as an important branch of nonlinear system theory, has developed rapidly and gradually become a hot research topic in modern natural science. In addition, chaos has yielded substantial and far-reaching achievements in different research areas, such as stability and bifurcation analysis [3]–[7], coexisting attractors and hidden attractors [8]–[13], chaos control and chaos synchronization [14]–[17], and dynamic behavior analysis of memristors and neural networks [18]–[22]. After the general laws of chaos were understood, research on chaos began to develop toward applications [23]–[28].…”
Section: Introduction
confidence: 99%
“…In [9], fractional-order chaotic systems with completely unknown dynamics and structure are proposed to study power systems. Recently, owing to the unique advantages of fractional calculus, fractional-order neural networks (FNNs), as a popular and important class of networks, have attracted increasing attention [10], [11], [12], [13], [14], [15]. Among these studies, stability and synchronization of FNNs are major topics, and many sufficient conditions have been proposed for FNNs with or without time delay.…”
Section: Introduction
confidence: 99%
“…Asymptotic and finite-time cluster synchronization criteria of coupled FNNs with time delay are proposed in [14]. In [15], the existence and local Mittag-Leffler (ML) stability of multiple equilibria are studied for a class of FNNs with discontinuous and nonmonotonic activation functions.…”
Section: Introduction
confidence: 99%