2022
DOI: 10.1140/epjs/s11734-022-00625-3

Application of variable-order fractional calculus in neural networks: where do we stand?

Abstract: After decades of evolution and advancement, artificial intelligence has profoundly affected all fields of science and engineering, and has started to revolutionize all aspects of today's life. In this regard, neural networks, which are a stepping stone in the search for artificial intelligence, play an important role. Motivated by this, the current special issue aims to explore recent trends and developments in the modeling, analysis, synchronization, and practical application of chaotic variable-order fractio…

Cited by 13 publications (7 citation statements) · References 29 publications
“…In recent years, the dynamical behavior of fractional-order neural networks (FONNs) has been widely studied in [22][23][24][25][26][27][28][29][30], especially fractional-order memristive neural networks (FOMNNs) [31][32][33][34][35][36][37][38][39][40][41]. Chen et al. investigated Mittag-Leffler synchronization of an FOMNN by using an M-matrix method and set-valued theory in [31].…”
Section: Introduction (mentioning)
confidence: 99%
“…One challenge with machine learning methods, such as the kriging algorithm, is their limited performance in beyond-the-data-range scenarios (Yousefpour et al., 2022). However, our proposed approach mitigates this through a correction term.…”
Section: Comparison With Other Techniques (mentioning)
confidence: 99%
“…The proposed process for weight evolution of the neural network is described by the following adaptation law: $\dot{\hat{W}}_i = -\gamma_i s_i^{T} \phi_i$ (12), in which $\gamma_i$ denotes a positive design parameter.…”
Section: Super-twisting Controller (mentioning)
confidence: 99%
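The quoted adaptation law only specifies how the weight estimates evolve; the controller structure, the sliding variable s_i, and the regressor phi_i come from the cited paper and are not reproduced here. As a rough illustration under assumed shapes (scalar sliding variable per subsystem, vector regressor), a minimal Python sketch of one explicit-Euler step of this update could look like:

import numpy as np

def adaptation_step(W_hat, s_i, phi_i, gamma_i, dt):
    # One explicit-Euler step of the quoted law dW_hat/dt = -gamma_i * s_i * phi_i (eq. (12)).
    # W_hat   : current weight estimate, shape (n,)      -- assumed to be a vector
    # s_i     : sliding-surface value                    -- assumed to be a scalar
    # phi_i   : regressor / basis-function vector, (n,)  -- assumed to be a vector
    # gamma_i : positive adaptation gain (design parameter)
    # dt      : integration step
    W_dot = -gamma_i * s_i * phi_i
    return W_hat + dt * W_dot

# Toy usage with arbitrary numbers (purely illustrative):
W = np.zeros(3)
W = adaptation_step(W, s_i=0.4, phi_i=np.array([1.0, 0.5, -0.2]), gamma_i=2.0, dt=0.01)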
“…Recently, some studies have shown that variable-order fractional (VOF) operators can capture the dynamics of some systems more accurately [12]. The applications of VOF neural networks are vast and promising.…”
Section: Introduction (mentioning)
confidence: 99%
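The quoted statement refers to variable-order fractional operators in general; the specific definition used in [12] is not quoted here. Purely as an illustration, a common Grünwald–Letnikov-type discretization of a variable-order derivative D^{α(t)} f(t) on a uniform grid can be sketched in Python as follows (the sinusoidal order α(t) in the usage example is an arbitrary choice, not taken from the cited works):

import numpy as np

def gl_variable_order_derivative(f_vals, alpha_vals, h):
    # Grünwald-Letnikov-type approximation of D^{alpha(t_k)} f evaluated at each
    # point t_k of a uniform grid with spacing h.
    # f_vals     : samples f(t_0), ..., f(t_N)
    # alpha_vals : order alpha(t_k) at each grid point
    N = len(f_vals)
    D = np.zeros(N)
    for k in range(N):
        a = alpha_vals[k]
        c = 1.0                       # c_0 = (-1)^0 * binom(a, 0) = 1
        acc = c * f_vals[k]
        for j in range(1, k + 1):
            c *= (j - 1 - a) / j      # recursion for c_j = (-1)^j * binom(a, j)
            acc += c * f_vals[k - j]
        D[k] = acc / h**a
    return D

# Toy usage: variable-order derivative of f(t) = t with a time-varying order.
t = np.linspace(0.0, 2.0, 201)
alpha = 0.5 + 0.3 * np.sin(t)
Df = gl_variable_order_derivative(t, alpha, h=t[1] - t[0])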