2004
DOI: 10.1007/978-1-4757-3819-3

Convergence Analysis of Recurrent Neural Networks

Abstract: Softcover reprint of the hardcover 1st edition 2004. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, microfilming, recording, or otherwise, without the prior written permission of the publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.

Cited by 72 publications (34 citation statements); References: 0 publications.
“…Since (x(t), S(t)) is bounded, this theorem can be proved easily by using the same arguments as those in [12,13].…”
Section: Complete Convergence (mentioning)
confidence: 84%
“…It is well known that the activation functions are important factors affecting the dynamic behavior of neural networks. Neural networks with unsaturated linear activation functions have received extensive interest recently; see, for example, [2,4,5,10,12,13]. Such neural networks have potential importance in many applications.…”
Section: Introduction (mentioning)
confidence: 99%
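The quote above concerns recurrent networks whose activation functions are unsaturated (linear above threshold, like a ReLU) rather than bounded sigmoids. As a minimal sketch of such dynamics, not the specific model analyzed in the cited works, the following integrates dx/dt = -x + W·σ(x) + b with σ(u) = max(0, u) by forward Euler; the weight matrix W and bias b are illustrative assumptions chosen so the trajectory contracts to an equilibrium:

```python
import numpy as np

def relu(u):
    # unsaturated linear ("linear-threshold") activation
    return np.maximum(0.0, u)

def simulate(W, b, x0, dt=0.01, steps=5000):
    # forward-Euler integration of dx/dt = -x + W @ relu(x) + b
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + W @ relu(x) + b)
    return x

# Hypothetical weights with small spectral radius, so the network
# converges; larger weights can produce unbounded trajectories,
# which is why boundedness analysis matters for these activations.
W = np.array([[0.2, -0.1],
              [0.1,  0.3]])
b = np.array([0.5, -0.2])

x_final = simulate(W, b, x0=[1.0, 1.0])
# at an equilibrium the vector field vanishes: -x + W @ relu(x) + b = 0
residual = np.linalg.norm(-x_final + W @ relu(x_final) + b)
print(x_final, residual)
```

Because the activation is unbounded, boundedness of trajectories is not automatic as it is for sigmoidal networks, which is the issue the quoted papers study.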
“…Obviously, the algorithms have many equilibria and, thus, the study of dynamics belongs to a multistability problem [27]. According to the Lyapunov indirect method, an equilibrium of an algorithm is stable if the absolute value of each eigenvalue of the Jacobian matrix of the algorithm at this point is less than 1 [16].…”
Section: Definition 3: A Point W* ∈ R^N Is Called An Equilibrium Of ( (mentioning)
confidence: 99%
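The Lyapunov indirect (linearization) test quoted above can be sketched numerically. For a discrete-time update w_{k+1} = F(w_k), an equilibrium w* is locally stable when every eigenvalue of the Jacobian dF/dw at w* has absolute value less than 1. The map F below is a made-up example with an equilibrium at the origin; the Jacobian is estimated by central differences:

```python
import numpy as np

def F(w):
    # hypothetical update rule with an equilibrium at w* = (0, 0)
    return np.array([0.5 * w[0] + 0.1 * np.tanh(w[1]),
                     -0.2 * np.sin(w[0]) + 0.8 * w[1]])

def jacobian(f, w, eps=1e-6):
    # numerical Jacobian of f at w via central differences
    w = np.asarray(w, dtype=float)
    n = w.size
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(w + e) - f(w - e)) / (2 * eps)
    return J

w_star = np.zeros(2)
J = jacobian(F, w_star)
# the indirect method checks the spectral radius of the Jacobian
spectral_radius = max(abs(np.linalg.eigvals(J)))
stable = spectral_radius < 1
print(spectral_radius, stable)
```

In a multistability setting the same test is applied at each of the many equilibria; those with spectral radius below 1 are locally attracting, the others are not.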
“…But in many real applications, biological systems are no longer globally stable; thus, more appropriate notions of stability are needed to deal with multistable networks. Yi and Tan (2004) investigated three important properties of multistable networks: boundedness, attractivity, and complete convergence. Liao and Wang (2003) addressed the global dissipativity of a general class of continuous-time recurrent neural networks.…”
Section: Introduction (mentioning)
confidence: 99%