1988
DOI: 10.1016/0893-6080(88)90007-x
Generalization of backpropagation with application to a recurrent gas market model

Abstract: Backpropagation is often viewed as a method for adapting artificial neural networks to classify patterns. Based on parts of the book by Rumelhart and colleagues, many authors equate backpropagation with the generalized delta rule applied to fully-connected feedforward networks. This paper will summarize a more general formulation of backpropagation, developed in 1974, which does more justice to the roots of the method in numerical analysis and statistics, and also does more justice to creative approaches expressed…

Cited by 799 publications (376 citation statements) | References 9 publications
“…A paper of 1986 significantly contributed to the popularisation of BP for NNs, experimentally demonstrating the emergence of useful internal representations in hidden layers. See generalisations for sequence-processing recurrent NNs (e.g., Williams, 1989; Robinson and Fallside, 1987; Werbos, 1988; Zipser, 1988, 1989b,a; Rohwer, 1989; Pearlmutter, 1989; Gherrity, 1989; Williams and Peng, 1990; Schmidhuber, 1992a; Pearlmutter, 1995; Baldi, 1995; Kremer and Kolen, 2001; Atiya and Parlos, 2000), also for equilibrium RNNs (Almeida, 1987; Pineda, 1987) with stationary inputs.…”
Section: 1960-1981 and Beyond: Development of Backpropagation (BP) for NNs
Citation type: mentioning
confidence: 99%
“…All of these are the basis for the present forest models and their simulations; but over the last two decades, machine learning models have drawn attention and established themselves as serious contenders to classical statistical models in the forecasting community [1]. These models, also called black-box or data-driven models [11], are examples of nonparametric nonlinear models which use only historical data to learn the stochastic dependency between the past and the future. For instance, Werbos found that Artificial Neural Networks (ANNs) outperform classical statistical methods such as linear regression and Box-Jenkins approaches [18,19]. A similar study was conducted by Lapedes and Farber (1987), who concluded that ANNs can be successfully used for modelling and forecasting nonlinear time series. Later, other models appeared, such as decision trees, support vector machines and nearest neighbour regression [2,7]. Moreover, the empirical accuracy of several machine learning models has been explored in a number of forecasting competitions under different data conditions (e.g.…”
Section: Literature Review
Citation type: mentioning
confidence: 97%
“…This leads RNNs to have an internal state, which changes over time even if the input does not change. Backpropagation through time is a generalization of backpropagation, enabling this gradient-based method to be applied to RNNs (Werbos, 1988). Due to the cycles in RNNs, and the internal state that arises from them, RNNs cannot be captured by the unified model.…”
Section: Backpropagation for Recurrent Neural Networks (RNNs)
Citation type: mentioning
confidence: 99%
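
The last citation statement above describes the core idea of the cited paper: unroll a recurrent network in time so that ordinary gradient backpropagation applies to it. The following is a minimal sketch of backpropagation through time (BPTT) for a vanilla tanh RNN in Python/NumPy. It is illustrative only: the architecture, sizes, learning rate, and random toy task are assumptions for the example, not Werbos's recurrent gas market model.

import numpy as np

# Minimal BPTT sketch for a vanilla RNN (hypothetical illustration,
# not the model from the 1988 paper).
rng = np.random.default_rng(0)
H, I = 8, 3                        # hidden and input sizes (arbitrary)
Wx = rng.normal(0.0, 0.1, (H, I))  # input-to-hidden weights
Wh = rng.normal(0.0, 0.1, (H, H))  # recurrent weights: the source of the internal state
w_out = rng.normal(0.0, 0.1, H)    # hidden-to-output weights (scalar output)

def bptt(xs, ys):
    """Unroll the recurrence forward in time, then propagate the
    squared-error gradient backwards through the unrolled graph."""
    T = len(xs)
    hs = [np.zeros(H)]             # h_0: initial internal state
    preds = []
    for t in range(T):             # forward: h_t = tanh(Wx x_t + Wh h_{t-1})
        hs.append(np.tanh(Wx @ xs[t] + Wh @ hs[-1]))
        preds.append(w_out @ hs[-1])
    dWx, dWh, dw_out = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(w_out)
    dh_next = np.zeros(H)          # gradient arriving from step t+1
    for t in reversed(range(T)):   # backward pass through time
        derr = preds[t] - ys[t]    # d(0.5 * error^2) / d(pred_t)
        dw_out += derr * hs[t + 1]
        dh = derr * w_out + dh_next
        dpre = dh * (1.0 - hs[t + 1] ** 2)  # tanh derivative
        dWx += np.outer(dpre, xs[t])
        dWh += np.outer(dpre, hs[t])
        dh_next = Wh.T @ dpre      # the cross-time term plain backprop lacks
    return dWx, dWh, dw_out

# Toy usage: one gradient step on a random length-5 sequence.
xs = [rng.normal(size=I) for _ in range(5)]
ys = [rng.normal() for _ in range(5)]
for W, dW in zip((Wx, Wh, w_out), bptt(xs, ys)):
    W -= 0.01 * dW                 # in-place SGD update

The decisive line is the dh_next update: it threads the error gradient backwards across time steps through the recurrent weights Wh, which is what allows a gradient method designed for feedforward networks to handle the cycles and internal state mentioned in the quote above.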