2010
DOI: 10.3103/s1066530710020055
Efficient on-line estimation of autoregressive parameters

Abstract: New procedures for estimating autoregressive parameters in AR(m) models are proposed. The proposed procedures allow for the incorporation of auxiliary information into the estimation process and, under certain regularity conditions, are consistent and asymptotically efficient. They are also naturally on-line and do not require storing all the data. Theoretical results are presented in the case m = 1. Two important particular cases are considered in detail: linear procedures and likelihood procedures…
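As a rough illustration of what "on-line" means here, below is a minimal sketch, assuming an AR(1) model X_t = θ X_{t-1} + ε_t with i.i.d. noise. The recursive update, step-size choice, and function names are illustrative stand-ins, not the exact procedures proposed in the paper.

```python
import numpy as np

def online_ar1_estimate(xs, theta0=0.0):
    """Recursive (on-line) estimate of the AR(1) parameter theta.

    Each new observation updates the running estimate and one running
    sum of squares, so past data never need to be stored.  This is a
    recursive least-squares-type sketch, not the paper's procedure.
    """
    theta = theta0
    prev = 0.0        # previous observation X_{t-1}
    s = 1e-8          # running sum of X_{t-1}^2 (step-size normaliser)
    for x in xs:
        s += prev * prev
        # innovation-driven correction: theta moves toward the value
        # that best predicts the new observation from the previous one
        theta += (prev / s) * (x - theta * prev)
        prev = x
    return theta

# usage: simulate an AR(1) path with theta = 0.6 and recover it on-line
rng = np.random.default_rng(0)
theta_true, n = 0.6, 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta_true * x[t - 1] + rng.standard_normal()
print(online_ar1_estimate(x))   # close to 0.6 for large n
```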

Cited by 5 publications (6 citation statements)
References 21 publications
“…Performing this correction requires knowledge of the AR(1) parameter φ. If it is unknown, it can be estimated robustly either on a batch of the data or sequentially using SA-estimates (see Sharia, 2010)…”
Section: Case 2: Temporal Dependence (mentioning, confidence: 99%)
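For context on the batch alternative mentioned in this quote, here is a minimal sketch assuming the same AR(1) setup as above; the sequential SA-type alternative is essentially the recursive update sketched after the abstract. The function name is illustrative.

```python
import numpy as np

def batch_ar1_estimate(x):
    """Batch least-squares (Yule-Walker-type) estimate of the AR(1)
    parameter phi from a fully stored sample x[0..n-1]."""
    x = np.asarray(x, dtype=float)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))
```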
“…The analysis is illustrated in the case of an AR(1) process. The approach, based on Robbins–Monro stochastic approximation, is generalized by Sharia () to m-parameter truncation estimation (see also Sharia (), where other references can be found). It is partially illustrated on AR(m) estimation.…”
Section: Introduction (mentioning, confidence: 99%)
“…), z_t in () for γ_t = 1/t, and use Theorems 1 and 4 of Ljung () by checking a set of conditions, denoted by C, in the Appendix. Like Ouakasse and Mélard (), we follow Benveniste et al. () to analyze a simplified algorithm, unlike Sharia (). Note that the fully parametrized models were not considered before, even for the original RML algorithm.…”
Section: Introduction (mentioning, confidence: 99%)
“…Such a procedure is obviously consistent since θ̂_t ∈ [θ̃_t − δ_t, θ̃_t + δ_t] and θ̃_t ± δ_t → θ. However, to construct an efficient estimator, care should be taken to ensure that the truncation intervals do not shrink to θ̃_t too rapidly, for otherwise θ̂_t will have the same asymptotic properties as θ̃_t (see [28] for details in the case of AR processes). Since this paper is concerned with convergence, details of this application are not discussed here.…”
Section: Introduction (mentioning, confidence: 99%)
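To make the truncation idea in this quote concrete, a minimal sketch follows, assuming an auxiliary recursive least-squares estimate as the centre of the truncation interval and a slowly shrinking half-width δ_t = delta0 / t**rate. The projection step, step sizes, and the names delta0 and rate are illustrative assumptions, not the construction analysed in [28].

```python
import numpy as np

def truncated_online_ar1(xs, delta0=1.0, rate=0.25):
    """Sketch of truncated recursive AR(1) estimation.

    A raw stochastic-approximation update is projected ("truncated")
    onto the interval [tilde - delta_t, tilde + delta_t] around an
    auxiliary recursive least-squares estimate `tilde`.  The interval
    shrinks slowly, so the truncated estimate inherits consistency from
    the auxiliary one without being forced to share its asymptotics.
    """
    theta = 0.0    # truncated SA estimate
    tilde = 0.0    # auxiliary recursive least-squares estimate
    prev, s = 0.0, 1e-8
    for t, x in enumerate(xs, start=1):
        s += prev * prev
        tilde += (prev / s) * (x - tilde * prev)              # auxiliary estimate
        raw = theta + (1.0 / t) * prev * (x - theta * prev)   # raw SA step
        delta = delta0 / t ** rate                            # shrinking half-width
        theta = float(np.clip(raw, tilde - delta, tilde + delta))
        prev = x
    return theta
```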
“…For example, an idea of truncations with shrinking bounds goes back to [10] and [13]. Truncations with expanding bounds were considered in [1] and also, in the context of recursive parametric estimation, in [23] (see also [28]). Truncations with adaptive truncation sets of the Robbins–Monro SA were introduced by Chen and Zhu in [5], and further explored and extended in [6], [2], [30], [31], [17].…”
Section: Introduction (mentioning, confidence: 99%)