2000
DOI: 10.1109/78.847778
Efficient training of neural nets for nonlinear adaptive filtering using a recursive Levenberg-Marquardt algorithm

Cited by 162 publications (76 citation statements)
References 18 publications
“…The second derivative of the cost function in Equation (4) with respect to the parameter vector, known as the Hessian matrix, can be approximated classically [17], [18] by Equation (7); the output sensitivity function is defined as the partial derivative of the predicted output with respect to the parameter vector…”
Section: Proposed Identification Methods (mentioning, confidence: 99%)
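The classical approximation the excerpt refers to is the Gauss-Newton one: drop the second-order term of the Hessian and keep only the outer products of the output sensitivities. The paper's Equation (7) is not reproduced on this page, so the following is a minimal sketch under that standard convention; the function name and the row-stacked Jacobian layout are illustrative assumptions.

```python
import numpy as np

def gauss_newton_hessian(psi):
    """Approximate the Hessian of a least-squares cost from output
    sensitivities psi[t] = d y_hat(t) / d theta (one row per sample).

    The Gauss-Newton approximation keeps only the outer-product sum
    H ~= sum_t psi_t psi_t^T, which is symmetric positive
    semi-definite by construction.
    """
    psi = np.asarray(psi, dtype=float)
    return psi.T @ psi

# Example: 2 samples of a 2-parameter model
H = gauss_newton_hessian([[1.0, 2.0], [3.0, 4.0]])
```

This is why the approximation is attractive for recursive identification: it needs only first-order sensitivities, which are already computed for the gradient.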
“…Therefore, we introduce the proposed RLM algorithm, which optimizes the following least-squares cost function using the forgetting-factor mechanism [17], [18]:…”
Section: Proposed Identification Methods (mentioning, confidence: 99%)
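The cost function itself is not reproduced in the excerpt. A common form of a forgetting-factor least-squares cost weights recent prediction errors exponentially more than old ones; the sketch below assumes that convention (the symbol `lam`, the 1/2 factor, and the weighting order are assumptions, not the paper's exact definition).

```python
import numpy as np

def forgetting_factor_cost(errors, lam=0.98):
    """Exponentially weighted least-squares cost

        J_N = (1/2) * sum_{t=1}^{N} lam**(N - t) * e(t)**2,

    where lam in (0, 1] discounts old prediction errors so a
    recursive estimator can track time-varying systems.
    """
    e = np.asarray(errors, dtype=float)
    n = e.size
    weights = lam ** (n - 1 - np.arange(n))  # newest error gets weight 1
    return 0.5 * np.sum(weights * e**2)
```

With `lam = 1` this reduces to the ordinary least-squares cost; smaller `lam` shortens the effective memory to roughly `1 / (1 - lam)` samples.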
“…Here a recursive Levenberg-Marquardt algorithm proposed by Ngia and Sjöberg [10] is used. It is essentially a regularised implementation of the recursive prediction-error algorithm and is defined as follows:…”
Section: Neural Network Implementation (mentioning, confidence: 99%)
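Ngia and Sjöberg's recursion is not reproduced in the excerpt. To make the idea concrete, here is a simplified information-matrix sketch of a recursive Gauss-Newton update with a forgetting factor `lam` and Levenberg-Marquardt damping `delta` (all names are hypothetical, and the paper's exact regularised update differs in detail).

```python
import numpy as np

def rlm_step(theta, R, psi, err, lam=0.99, delta=1e-3):
    """One simplified recursive Levenberg-Marquardt step.

    R accumulates the forgetting-factor-weighted Gauss-Newton Hessian
    (outer products of the output sensitivity psi); delta * I is the
    Levenberg-Marquardt damping that keeps the step well-conditioned.
    """
    R = lam * R + np.outer(psi, psi) + delta * np.eye(theta.size)
    theta = theta + np.linalg.solve(R, psi * err)
    return theta, R

# Usage: identify a linear-in-parameters model y = psi . theta_true
rng = np.random.default_rng(1)
theta_true = np.array([1.0, -0.5])
theta, R = np.zeros(2), np.eye(2)
for _ in range(300):
    psi = rng.standard_normal(2)
    err = psi @ theta_true - psi @ theta  # prediction error
    theta, R = rlm_step(theta, R, psi, err)
```

For a neural-network filter, `psi` would be the gradient of the network output with respect to all weights, obtained by backpropagation at each sample; the damping plays the same trust-region role as in batch Levenberg-Marquardt.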