2016
DOI: 10.1162/neco_a_00821

Recurrent Neural Network for Computing Outer Inverse

Abstract: Two linear recurrent neural networks for generating outer inverses with prescribed range and null space are defined. Each of the proposed recurrent neural networks is based on a matrix-valued differential equation that generalizes dynamic equations proposed earlier for nonsingular matrix inversion, Moore-Penrose inversion, and Drazin inversion, under the condition of zero initial state. The application of the first approach is conditioned by the properties of the spectrum of a certain …
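As background for the abstract (a minimal sketch, not the paper's own model), the dynamic equation classically used for nonsingular matrix inversion is V̇(t) = −γ Aᵀ(A V(t) − I) with zero initial state V(0) = 0. The NumPy snippet below integrates it with an explicit Euler step; the gain γ, step size, iteration count, test matrix, and tolerance are illustrative assumptions.

```python
import numpy as np

def gnn_inverse(A, gamma=10.0, dt=1e-3, steps=3000):
    """Explicit-Euler integration of the GNN dynamics
       dV/dt = -gamma * A^T (A V - I),  V(0) = 0,
    whose state V(t) converges to A^{-1} for nonsingular A."""
    n = A.shape[0]
    V = np.zeros((n, n))      # zero initial state, as in the dynamic equation
    I = np.eye(n)
    for _ in range(steps):
        V += dt * (-gamma) * (A.T @ (A @ V - I))
    return V

if __name__ == "__main__":
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])   # illustrative nonsingular test matrix
    V = gnn_inverse(A)
    print(np.allclose(V @ A, np.eye(3), atol=1e-6))   # expected: True
```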

Cited by 32 publications (6 citation statements)
References 31 publications
“…The dynamic equation and induced gradient recurrent neural network for computing the Drazin inverse were defined in [24]. Two gradient-based RNNs for generating outer inverses with prescribed range and null space in the time-invariant case were introduced in [25]. Two additional dynamic state equations and corresponding gradient-based RNNs for generating the class of outer inverses of time-invariant real matrices were proposed in [26].…”
Section: Proposition 2 (Urquhart Formula): Let A ∈ ℂ_r^{m×n} …
mentioning, confidence: 99%
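The section title above refers to the Urquhart formula. Below is a minimal numerical sketch of the usual Urquhart-type construction X = U (V A U)^(1) V, here taking the Moore-Penrose inverse as the chosen inner inverse; the matrices, dimensions, and random seed are illustrative, and the rank condition rank(VAU) = rank(U) = rank(V), under which X equals the outer inverse A^(2)_{R(U),N(V)}, is checked explicitly rather than assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, p = 5, 4, 2                       # illustrative dimensions

A = rng.standard_normal((m, n))         # A in R^{m x n}
U = rng.standard_normal((n, p))         # prescribes the range R(U)
V = rng.standard_normal((p, m))         # prescribes the null space N(V)

# Urquhart-type construction, using the Moore-Penrose inverse as the inner inverse.
X = U @ np.linalg.pinv(V @ A @ U) @ V

r = np.linalg.matrix_rank

# Rank condition under which X = A^(2)_{R(U),N(V)}.
print("rank condition:  ", r(V @ A @ U) == r(U) == r(V))

# Defining property of an outer inverse: X A X = X.
print("outer inverse:   ", np.allclose(X @ A @ X, X))

# R(X) = R(U): column space of X lies in that of U and the ranks agree.
print("range check:     ", r(np.hstack([X, U])) == r(U) == r(X))

# N(X) = N(V): row space of X lies in that of V and the ranks agree.
print("null space check:", r(np.vstack([X, V])) == r(V) == r(X))
```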
“…In addition, we define a hybrid method which starts from the ZNN model (2.3) and finishes with (2.10). Finally, GNN denotes the gradient-based neural network from [30] in the nonsingular case, corresponding to the case G = A^T in the RNN1 model. The Simulink implementation of the ZNNNM model (2.3), restated in the equivalent form …”
Section: Neural Network Architecture
mentioning, confidence: 99%
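For context on the ZNN/GNN contrast in the excerpt above (a generic design sketch, not necessarily the numbered models (2.3) and (2.10) of the citing paper): the standard zeroing neural network for time-varying inversion imposes Ė(t) = −γ E(t) on the error E(t) = A(t)X(t) − I, which yields A(t)Ẋ(t) = −Ȧ(t)X(t) − γ(A(t)X(t) − I). The Euler-discretized sketch below uses an illustrative A(t), gain, and step size.

```python
import numpy as np

def A_t(t):
    # Illustrative time-varying, symmetric positive definite 2x2 matrix.
    return np.array([[2.0 + np.sin(t), 0.5 * np.cos(t)],
                     [0.5 * np.cos(t), 2.0 - np.sin(t)]])

def A_dot(t, h=1e-6):
    # Central-difference approximation of dA/dt (an analytic derivative works too).
    return (A_t(t + h) - A_t(t - h)) / (2.0 * h)

def znn_time_varying_inverse(T=5.0, dt=1e-3, gamma=100.0):
    """Euler discretization of the generic ZNN model
       A(t) dX/dt = -dA/dt X - gamma (A(t) X - I),
    obtained by imposing dE/dt = -gamma E on the error E(t) = A(t) X(t) - I."""
    I = np.eye(2)
    X = np.linalg.inv(A_t(0.0))     # start from the exact inverse at t = 0
    t = 0.0
    while t < T:
        A, Ad = A_t(t), A_dot(t)
        X_dot = np.linalg.solve(A, -Ad @ X - gamma * (A @ X - I))
        X += dt * X_dot
        t += dt
    return X, t

if __name__ == "__main__":
    X_end, t_end = znn_time_varying_inverse()
    print(np.allclose(X_end @ A_t(t_end), np.eye(2), atol=1e-3))   # expected: True
```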
“…The study of generalized inverses of matrices has been a very important research field since the middle of the last century and remains one of the most active research branches in the world [1][2][3]. Generalized inverses, including the weighted pseudoinverse, have numerous applications in various fields, such as control, networks, statistics, and econometrics [4][5][6][7]. The ℳℒ-weighted pseudoinverse of an m × n matrix 𝒦 with two weight matrices ℳ and ℒ (of order s × m and l × n, respectively) is defined as…”
Section: Introduction
mentioning, confidence: 99%