Dynamic complex-valued matrix inversion is widely used in mathematics and engineering. Over the past years, many recurrent neural network models have been designed and studied for matrix inversion without noise interference. In practical model design and analysis, however, many types of uncertain noise are present. In this article, a novel fully complex-valued and robust zeroing neural network (CVRZNN) is proposed for computing the dynamic complex matrix inverse under external noise, and its robustness is analysed and demonstrated in the presence of various types of external noise. Compared with the previous zeroing neural network (ZNN) and gradient neural network (GNN) for dynamic complex matrix inversion, the proposed CVRZNN model exhibits good robustness under three kinds of external noise. Moreover, the theoretical analysis shows that the residual error of the CVRZNN model globally converges to zero under constant noise. Comparative simulation results clearly demonstrate the excellent performance of the proposed CVRZNN model, which is much better than that of the previous GNN and ZNN models.

INDEX TERMS: Zeroing neural network (ZNN), gradient neural network, dynamic complex-valued matrix inversion, robustness, external noise.
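For readers unfamiliar with the zeroing-neural-network design formula this abstract builds on, the following is a minimal, noise-free simulation sketch, not the proposed CVRZNN model: the error function E(t) = A(t)X(t) - I is forced to obey dE/dt = -gamma*E, and the resulting implicit dynamics are integrated with a simple Euler scheme. The function name znn_inverse, the gain gamma = 10, and the 2x2 complex test matrix are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def znn_inverse(A, dA, gamma=10.0, T=2.0, dt=1e-3):
    """Simulate a basic (noise-free) zeroing neural network for dynamic
    matrix inversion via Euler integration.

    A(t), dA(t): callables returning the time-varying matrix and its derivative.
    With error E(t) = A(t) X(t) - I, imposing dE/dt = -gamma * E gives the
    implicit dynamics A(t) dX/dt = -dA(t) X - gamma * (A(t) X - I),
    which is solved for dX/dt at every step.
    """
    n = A(0.0).shape[0]
    X = np.linalg.inv(A(0.0) + 0.1 * np.eye(n))   # rough initial state
    for k in range(int(T / dt)):
        t = k * dt
        At, dAt = A(t), dA(t)
        E = At @ X - np.eye(n)                     # residual error
        dX = np.linalg.solve(At, -dAt @ X - gamma * E)
        X = X + dt * dX                            # Euler step
    return X

# Illustrative time-varying complex matrix A(t) = [[2, e^{it}], [e^{-it}, 2]]
A  = lambda t: np.array([[2.0, np.exp(1j * t)], [np.exp(-1j * t), 2.0]])
dA = lambda t: np.array([[0.0, 1j * np.exp(1j * t)], [-1j * np.exp(-1j * t), 0.0]])
X_end = znn_inverse(A, dA)
print(np.linalg.norm(A(2.0) @ X_end - np.eye(2)))  # residual should be small
```

The abstract's point is that this basic design degrades once noise is injected into the dynamics; the CVRZNN model modifies the design formula to suppress such noise, whereas the sketch above only illustrates the noise-free baseline against which ZNN/GNN comparisons are made.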
This paper studies how to solve the dynamic Sylvester quaternion matrix equation (DSQME) using the neural dynamic method. To solve the DSQME, the complex representation method is first adopted to derive the equivalent dynamic Sylvester complex matrix equation (DSCME) from the DSQME. It is proved that the solution to the DSCME is essentially the same as that of the DSQME. Then, a state-of-the-art neural dynamic method is presented to generate a general dynamic-varying parameter zeroing neural network (DVPZNN) model, whose global stability is guaranteed by Lyapunov theory. Specifically, when the linear activation function is used in the DVPZNN model, the corresponding model (termed LDVPZNN) achieves finite-time convergence, and the convergence time bound is theoretically calculated. When the nonlinear power-sigmoid activation function is used, the corresponding model (termed PSDVPZNN) achieves better convergence than the LDVPZNN model, which is proved in detail. Finally, three examples are presented to compare the performance of different neural models in solving the DSQME and the equivalent DSCME; the results verify the correctness of the theoretical analysis and the superiority of the two proposed DVPZNN models.
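As background for the complex representation method mentioned in the abstract, the sketch below shows one common way to map a quaternion matrix Q = A + B*j (with A and B complex) to a complex block matrix and checks numerically that the map preserves matrix products. Conventions differ between papers, so this particular block form is an assumption rather than necessarily the exact representation used by the authors; the helper names complex_representation and rand_c are illustrative.

```python
import numpy as np

def complex_representation(A, B):
    """Complex representation of the quaternion matrix Q = A + B*j,
    where A and B are complex matrices of the same shape.
    The block matrix [[A, B], [-conj(B), conj(A)]] preserves
    quaternion-matrix addition and multiplication."""
    return np.block([[A, B], [-np.conj(B), np.conj(A)]])

def rand_c(shape, rng):
    """Random complex matrix, used only to test the representation."""
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

rng = np.random.default_rng(0)
A1, B1 = rand_c((3, 3), rng), rand_c((3, 3), rng)
A2, B2 = rand_c((3, 3), rng), rand_c((3, 3), rng)

# Quaternion product: (A1 + B1 j)(A2 + B2 j)
#   = (A1 A2 - B1 conj(B2)) + (A1 B2 + B1 conj(A2)) j
A12 = A1 @ A2 - B1 @ np.conj(B2)
B12 = A1 @ B2 + B1 @ np.conj(A2)

# The representation of the product equals the product of the representations.
assert np.allclose(complex_representation(A1, B1) @ complex_representation(A2, B2),
                   complex_representation(A12, B12))
```

Because such a representation preserves sums and products, a Sylvester equation posed over quaternion matrices can be rewritten as an equivalent equation over their complex representations, which is the kind of DSCME that the DVPZNN models described in the abstract are then applied to.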