2019
DOI: 10.1007/s11063-019-09983-x

Improved Gradient Neural Networks for Solving Moore–Penrose Inverse of Full-Rank Matrix

Cited by 26 publications (7 citation statements). References 32 publications.
“…As a result, the following conclusions follow: Remark 2. The particular HGZNN(A^TA, I, A^T) and GGNN(A^TA, I, A^T) designs define the corresponding modifications of the improved GNN design proposed in [26] if A^TA is invertible. In the dual case, HGZNN(I, CC^T, C^T) and GGNN(I, CC^T, C^T) define the corresponding modifications of the improved GNN design proposed in [26] if CC^T is invertible.…”
Section: GZNN(A, I, B) by the term
Citation type: mentioning
Confidence: 99%
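For context, the three-argument GNN(A, I, B) notation in these statements corresponds, in the usual gradient-neural-network construction, to the matrix equation A X I = B, i.e. AX = B; the following is a minimal sketch of that standard dynamics and of the specialization quoted in the remark (an assumption about the notation, not necessarily the exact model of [26]):

\[
\dot{X}(t) = -\gamma\, A^{T}\bigl(A X(t) - B\bigr), \qquad \gamma > 0 .
\]

Substituting A^{T}A for A and A^{T} for B, as in GGNN(A^{T}A, I, A^{T}), gives

\[
\dot{X}(t) = -\gamma\, A^{T}A\bigl(A^{T}A\, X(t) - A^{T}\bigr),
\]

whose equilibrium X^{*} = (A^{T}A)^{-1}A^{T} = A^{\dagger} is the Moore–Penrose inverse precisely when A^{T}A is invertible, i.e. when A has full column rank.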
“…A comparison with the corresponding GNN design was considered. Two improved nonlinear GNN dynamical systems for approximating the Moore–Penrose inverse of full-row-rank or full-column-rank matrices were proposed and considered in [26]. GNN-type models for solving matrix equations and computing related generalized inverses were developed in [1,3,13,16,18,20,27–29].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
“…Recently, various nonlinear and linear recurrent neural network (RNN) models have been developed for computing the pseudoinverse of rectangular matrices (for more details, see [8–11]). The gradient-based neural network (GNN), whose derivation is based on the gradient of a nonnegative energy function, is an alternative for calculating the Moore–Penrose generalized inverses [12–15]. These methods for computing inner inverses and other generalized inverses of a matrix frequently use matrix–matrix products and consume considerable computing time.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
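As a concrete illustration of that gradient-of-an-energy-function derivation, the sketch below Euler-integrates the flow obtained from E(X) = ||AX - I||_F^2 / 2 and compares the result with the pseudoinverse of a full-column-rank matrix; it is a minimal NumPy sketch with hypothetical parameter values, not the implementation used in the cited works.

import numpy as np

def gnn_pseudoinverse(A, gamma=10.0, dt=1e-3, steps=20000):
    # Explicit-Euler integration of the gradient flow
    #   dX/dt = -gamma * A^T (A X - I),
    # i.e. gradient descent on the energy E(X) = 0.5 * ||A X - I||_F^2.
    # The equilibrium is the least-squares minimizer (A^T A)^{-1} A^T,
    # which equals pinv(A) when A has full column rank.
    m, n = A.shape
    X = np.zeros((n, m))                 # GNN state, started from zero
    I = np.eye(m)
    for _ in range(steps):
        X -= dt * gamma * (A.T @ (A @ X - I))
    return X

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))          # full column rank with probability 1
X = gnn_pseudoinverse(A)
print(np.allclose(X, np.linalg.pinv(A), atol=1e-4))   # expected: True

For the explicit Euler scheme to stay stable, the effective step dt * gamma must remain below 2 / lambda_max(A^T A), which illustrates the parameter sensitivity and slow convergence that the quoted statements attribute to gradient-based learning.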
“…Thus, the SLFN structure is selected in this paper for simplicity, with the limitation that it is inapplicable to the Moore–Penrose inverse computation of time-varying matrices. Unfortunately, the gradient-descent (GD) method, which suffers from slow convergence and sensitivity to parameters [27], is often adopted as the learning algorithm for neural networks solving constant matrix inversion problems. In addition, the Moore–Penrose inverse computations of full-rank and rank-deficient matrices are almost always studied separately in the existing publications [28].…”
Section: Introduction
Citation type: mentioning
Confidence: 99%