1997
DOI: 10.1109/3477.558801

Computational capabilities of recurrent NARX neural networks

Abstract: Recently, fully connected recurrent neural networks have been proven to be computationally rich, at least as powerful as Turing machines. This work focuses on another network which is popular in control applications and has been found to be very effective at learning a variety of problems. These networks are based upon Nonlinear AutoRegressive models with eXogenous Inputs (NARX models), and are therefore called NARX networks. As opposed to other recurrent networks, NARX networks have a limited feedback…


Cited by 385 publications (180 citation statements). References 24 publications.
“…NARXnns have been shown to perform well on problems involving long-term dependencies, and they are capable of simulating universal dynamical systems [52,53]. The feasibility of NARXnns as a nonlinear tool for time series modeling and prediction is key for carrying out long-term time series predictions [54].…”
Section: NARXnn and Its Configuration for SM Retrieval
Confidence: 99%
“…In theory, it has been shown that the NARX networks can be used, rather than conventional recurrent networks, without computational loss and that they are equivalent to Turing machines [27]. In this case, u(t) and y(t) represent the input and the output of the network at time t, respectively.…”
Section: Recurrent NARX Neural Network
Confidence: 99%
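The statement above describes the NARX recurrence: the next output y(t) is computed from a window of delayed outputs and delayed exogenous inputs u(t). A minimal sketch of that update loop is below; the map `f` is a hand-written linear stand-in for a trained network, and its coefficients, the delay orders `n` and `m`, and the zero-initialized history are all illustrative assumptions, not details from the paper.

```python
# Minimal NARX recurrence sketch:
#   y(t) = f(y(t-1), ..., y(t-n), u(t-1), ..., u(t-m))
# `f` here is an illustrative linear map standing in for a trained network.

def run_narx(u, f, n=2, m=2):
    """Drive a NARX model over input sequence u from a zero history."""
    y = [0.0] * n                             # zero-initialized output taps
    for t in range(len(u)):
        u_hist = ([0.0] * m + u[:t])[-m:]     # last m inputs, zero-padded
        y_hist = y[-n:]                       # last n outputs
        y.append(f(y_hist, u_hist))
    return y[n:]                              # drop the initial padding

# Illustrative feedback map (a trained NARX network would replace this).
f = lambda y_hist, u_hist: 0.5 * y_hist[-1] + 0.3 * u_hist[-1]

out = run_narx([1.0, 0.0, 0.0, 0.0], f)
```

The limited-feedback structure the abstract mentions is visible here: only a fixed tapped-delay window of past outputs feeds back, rather than the full hidden state of a conventional recurrent network.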
“…Several recent papers have considered the nature of deterministic analog (continuous) recurrent neural networks (e.g., [41,43,42,25,22,28,14,44]). These consist of a finite number of neurons.…”
Section: Deterministic Analog Recurrent Network
Confidence: 99%