2008
DOI: 10.1007/s10463-008-0179-z
Recursive parameter estimation: asymptotic expansion

Abstract: We consider estimation procedures which are recursive in the sense that each successive estimator is obtained from the previous one by a simple adjustment. The model considered in the paper is very general as we do not impose any preliminary restrictions on the probabilistic nature of the observation process and cover a wide class of nonlinear recursive procedures. In this paper we study asymptotic behaviour of the recursive estimators. The results of the paper can be used to determine the form of a recursive …
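The abstract describes procedures in which each successive estimator is obtained from the previous one by a simple adjustment. As a minimal illustration only (a toy sketch, not the paper's general procedure), the following Python snippet recursively estimates a mean using a Robbins–Monro-type gain of 1/n; the function name and setup are assumptions for the example:

```python
import random

def recursive_mean_estimate(observations):
    """Recursive estimator: each estimate equals the previous one
    plus a simple adjustment (a stochastic-approximation step with
    gain 1/n). No batch of past data is ever revisited."""
    theta = 0.0
    for n, x in enumerate(observations, start=1):
        # simple adjustment of the previous estimate toward the new observation
        theta = theta + (x - theta) / n
    return theta

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(10_000)]
est = recursive_mean_estimate(data)
print(est)  # close to the true mean 2.0
```

Under standard regularity conditions such a recursion is consistent; the paper's results concern the finer asymptotic behaviour (rate and distribution) of estimators of this general recursive form.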

Cited by 8 publications (11 citation statements)
References 21 publications
“…Asymptotically under suitable regularity conditions, the mean of our gaussian is guaranteed to converge to the true θ . Consistency can be established by applying theorems for the consistency of estimators based on stochastic gradient descent (Fabian, 1978;Sharia, 2007). We used numerical simulations (data not shown) to verify the predictions of these theorems.…”
Section: Representing and Updating the Posterior
confidence: 99%
“…conditions that guarantee a property of type (2.12) for any u, and also conditions on the growth of the corresponding functions at infinity (see [20] for details). Once the convergence is secured, the rate of convergence and the asymptotic distribution depend on the local behaviour of the corresponding functions (like differentiability of higher order) and the ergodicity of the model (see [20]–[22]). The statistical models described in [20]–[22] are quite general, with no specific requirements on the dependence structure and the distribution of the underlying process.…”
Section: Stochastic Approximation Type Estimation Algorithms
confidence: 99%
“…Once the convergence is secured, the rate of convergence and the asymptotic distribution depend on the local behaviour of the corresponding functions (like differentiability of higher order) and the ergodicity of the model (see [20]–[22]). The statistical models described in [20]–[22] are quite general, with no specific requirements on the dependence structure and the distribution of the underlying process. The conditions in these works are given in terms of the conditional distributions.…”
Section: Stochastic Approximation Type Estimation Algorithms
confidence: 99%
“…Asymptotic behaviour of procedures of this type for non-i.i.d. models was studied by a number of authors; see, e.g., [7], [9], [18], [24]–[27]. Results in [27] show that to obtain an estimator with asymptotically optimal properties, one has to consider a state-dependent, matrix-valued random step-size sequence.…”
Section: Introduction
confidence: 99%