2022
DOI: 10.3390/math10173090

Fixed-Time Convergent Gradient Neural Network for Solving Online Sylvester Equation

Abstract: This paper aims at finding a fixed-time solution to the Sylvester equation by using a gradient neural network (GNN). To reach this goal, a modified sign-bi-power (msbp) function is presented and applied to a linear GNN as an activation function. Accordingly, a fixed-time convergent GNN (FTC-GNN) model is developed for solving the Sylvester equation. The upper bound of the convergence time of such an FTC-GNN model can be predetermined once the design parameters are given, regardless of the initial conditions. This point is c…
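A minimal sketch of the activated-GNN idea described in the abstract, for the Sylvester equation AX + XB = C. The gradient of the energy ε(X) = ‖AX + XB − C‖_F²/2 supplies the descent direction, and an activation array Φ(·) is applied elementwise to the residual. The msbp form used below (|e|^p·sign(e) + |e|^{1/p}·sign(e) + e) and all parameter values are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

def msbp(E, p=0.5):
    # Hypothetical modified sign-bi-power activation, applied elementwise;
    # the paper's exact msbp form may differ.
    s = np.sign(E)
    return s * np.abs(E) ** p + s * np.abs(E) ** (1.0 / p) + E

def gnn_sylvester(A, B, C, gamma=10.0, dt=1e-4, steps=20000):
    # Activated GNN, Euler-discretized. The residual is E = A X + X B - C,
    # and grad_X ||E||_F^2 / 2 = A^T E + E B^T, so the activated descent law is
    #   dX/dt = -gamma * (A^T Phi(E) + Phi(E) B^T).
    X = np.zeros_like(C)
    for _ in range(steps):
        E = A @ X + X @ B - C
        P = msbp(E)
        X -= dt * gamma * (A.T @ P + P @ B.T)
    return X
```

As in the analytic setting, the iteration converges only when the Sylvester equation is uniquely solvable, i.e. A and −B share no eigenvalues.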

Cited by 9 publications (14 citation statements)
References 40 publications
“…So far, it can be clearly obtained that for the continuous CLME (5), the state matrix P_i(t_m) will globally converge to the theoretical solution P_i^*. In summary, by selecting an appropriate activation function array Φ_i(⋅), the improved ZNN model (16) can converge to the theoretical solution of the continuous CLME (5). Thus, the theorem is completed.…”
Section: Theorem
confidence: 89%
“…Consider the improved ZNN model (16), assume that the Markovian jump system (1-2) is stochastically stable. For any given initial condition P_i(t_0) ∈ ℝ^{n×n}, i ∈ 𝕀[1, N], select Φ_i(⋅) as a monotonically increasing odd activation function array, then the state matrices P_i(t_m) ∈ ℝ^{n×n}, i ∈ 𝕀[1, N], globally converge to the unique positive definite solution of the continuous CLME (5).…”
Section: Theorem
confidence: 99%
“…The influence of activation functions (AFs) on the convergence performance of a GNN design for solving the matrix equation AXB + X = C was investigated in [31]. A fixed-time convergent GNN for solving the Sylvester equation was investigated in [32]. Moreover, noise-tolerant GNN models equipped with a suitable AF able to solve convex optimization problems were developed in [33].…”
Section: Introduction
confidence: 99%
“…As a result, neural dynamic methods are now considered a powerful alternative for online computation of matrix problems due to their parallel distribution properties and the convenience of hardware implementation [15,16]. As a typical RNN, the gradient-based neural network (GNN) is designed to solve minimization problems [8,17–19]. However, when coping with time-dependent cases, GNN produces relatively large lag errors [20].…”
Section: Introduction
confidence: 99%
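The lag error mentioned in the excerpt above can be seen on a scalar toy problem: a GNN derived for the frozen-time equation a(t)·x = c(t) trails the moving solution x*(t) = c(t)/a(t). The coefficient functions and gains below are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

def gnn_lag(gamma=20.0, dt=1e-3, t_end=20.0):
    # GNN for the scalar time-varying equation a(t) * x = c(t):
    #   dx/dt = -gamma * a(t) * (a(t) * x - c(t)),
    # the gradient flow of the frozen-time energy (a*x - c)^2 / 2. Because the
    # target x*(t) = c(t)/a(t) keeps moving, x(t) settles into a nonzero
    # steady-state lag of roughly x*'(t) / (gamma * a(t)^2).
    a = lambda t: 2.0 + np.sin(t)
    c = lambda t: 3.0 + np.cos(t)
    t, x = 0.0, c(0.0) / a(0.0)      # start exactly on the solution
    max_lag = 0.0
    while t < t_end:
        x += dt * (-gamma * a(t) * (a(t) * x - c(t)))
        t += dt
        if t > t_end - 2.0 * np.pi:  # measure over the final full period
            max_lag = max(max_lag, abs(x - c(t) / a(t)))
    return max_lag
```

Raising gamma shrinks the lag but never removes it, which is the motivation for ZNN-type designs that differentiate the error to cancel the drift.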