1979
DOI: 10.2307/2286743
Robust Estimation of the First-Order Autoregressive Parameter


Cited by 99 publications (72 citation statements)
References 0 publications
“…The tuning constants are set as CX = 6.0 and C,, = 3.9. Following Denby and Martin (1979), four iterations are calculated with the LS estimates as initial points for the GM estimates with Huber weights. The latter values are then used as starting points for the final GM estimates with Tukey bisquare weights.…”
Section: Monte Carlo Results (mentioning, confidence 99%)
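The excerpt above outlines a common iteration scheme for GM estimation of an AR(1) coefficient: start from least squares, run a few Huber-weighted iterations, then finish with Tukey bisquare weights. A minimal Python sketch of that scheme follows; the weight functions, the MAD scale, the Mallows-type regressor weight, and the tuning constants are assumptions for illustration, not the exact procedure of Denby and Martin (1979) or of the citing paper.

```python
import numpy as np

def huber_psi(u, c=6.0):
    # Huber psi-function: identity near zero, clipped at +/- c (tuning constant assumed).
    return np.clip(u, -c, c)

def bisquare_psi(u, c=3.9):
    # Tukey bisquare psi-function: redescends to zero beyond +/- c (tuning constant assumed).
    return np.where(np.abs(u) <= c, u * (1 - (u / c) ** 2) ** 2, 0.0)

def mallows_weight(z, c=2.0):
    # Downweight summands with a large lagged regressor x_{t-1} (Mallows-type weight; assumed form).
    s = 1.4826 * np.median(np.abs(z - np.median(z))) + 1e-12
    u = np.abs(z) / s
    return np.minimum(1.0, c / np.maximum(u, 1e-12))

def gm_ar1_stage(x, psi, phi0, n_iter=4):
    """One GM stage for an AR(1): iteratively reweighted least squares of x_t on x_{t-1},
    starting from phi0, with residuals standardized by a MAD scale."""
    y, z = x[1:], x[:-1]
    wz = mallows_weight(z)
    phi = phi0
    for _ in range(n_iter):
        r = y - phi * z
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        u = r / s
        u_safe = np.where(np.abs(u) < 1e-8, 1e-8, u)
        w = wz * psi(u_safe) / u_safe        # combined regressor and residual weights
        phi = np.sum(w * z * y) / np.sum(w * z * z)
    return phi

# LS start, four Huber-weighted iterations, then a final bisquare stage,
# mirroring the staged scheme quoted above.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + rng.normal()
phi_ls = np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)
phi_huber = gm_ar1_stage(x, huber_psi, phi_ls, n_iter=4)
phi_final = gm_ar1_stage(x, bisquare_psi, phi_huber)
print(phi_ls, phi_huber, phi_final)
```

On clean simulated data all three estimates land near the true coefficient; the point of the staged scheme is that the monotone Huber stage gives the redescending bisquare stage a safe starting value.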
“…GENERALIZED-M ESTIMATES Denby and Martin (1979), Miller (1980), and Chang et al. (1988) showed that the least squares estimates for linear autoregressive parameters not only lack robustness in terms of variability but also suffer from a severe bias problem when the observations are contaminated by outliers. Therefore it is expected that the existence of additive outliers may also cause similar problems in estimating threshold autoregressive models which are piecewise linear.…”
Section: Through Equations (5) (mentioning, confidence 99%)
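The bias problem described in this excerpt is easy to reproduce numerically. The sketch below (contamination settings are assumptions for illustration, not taken from any cited study) adds occasional gross errors to a simulated AR(1) series and compares the least squares estimates on the clean and contaminated series.

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi_true = 1000, 0.8

# Core AR(1) process x_t = phi * x_{t-1} + e_t.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Additive outliers: the observed series is y_t = x_t + v_t, where v_t is a
# large measurement error at a small fraction of randomly chosen time points.
y = x.copy()
outlier_idx = rng.choice(n, size=int(0.05 * n), replace=False)
y[outlier_idx] += rng.normal(0.0, 10.0, size=outlier_idx.size)

def ls_ar1(series):
    # Least squares estimate of the AR(1) coefficient.
    return np.sum(series[:-1] * series[1:]) / np.sum(series[:-1] ** 2)

print("clean LS estimate:       ", ls_ar1(x))  # typically close to 0.8
print("contaminated LS estimate:", ls_ar1(y))  # typically attenuated toward zero
```

With a few percent of large additive outliers the LS estimate is usually attenuated well below the true value, which is the bias problem the citing authors refer to.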
“…Several robust estimation procedures for autoregressive moving-average model parameters have been proposed along the line of Huber (1964) for location parameters. For example, Denby and Martin (1979) proposed the generalized M-estimates for autoregressive processes. The robustness is achieved by introducing trimmed weights for each summand of score functions.…”
Section: Introduction (mentioning, confidence 99%)
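As a rough schematic of what "trimmed weights for each summand of the score function" means in the AR(1) case, a Mallows-type GM estimating equation can be written as below; the notation is assumed for illustration and is not the exact form used by Denby and Martin (1979).

```latex
% Schematic Mallows-type GM estimating equation for an AR(1) parameter \phi:
% each summand x_{t-1}(x_t - \phi x_{t-1}) of the least-squares score is bounded
% through a regressor weight w(\cdot) and a psi-function \psi(\cdot), with scale s.
\sum_{t=2}^{n} w\!\left(x_{t-1}\right)\,
  \psi\!\left(\frac{x_t - \phi\,x_{t-1}}{s}\right) x_{t-1} \;=\; 0
```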
“…When the model is an autoregressive moving-average (ARMA) stochastic process, Bustos and Yohai (1986) state several results that show that the additive outliers are more problematic for the estimation of the parameters of an ARMA model. For example, the LS estimators may be severely biased (Denby and Martin, 1979). Innovation outliers occur in an ARMA model when u t has a distribution with tails heavier than those of a normal distribution.…”
Section: Introduction (mentioning, confidence 99%)
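The additive versus innovation outlier distinction drawn in the last excerpt can be summarized schematically as follows (AR(1) case, forms assumed for illustration): additive outliers contaminate the observed values without entering the dynamics, while innovation outliers enter through heavy-tailed shocks u_t and propagate through the recursion.

```latex
% Additive vs. innovation outliers for an AR(1) (schematic).
% AO: the underlying process is clean, but a few observations carry gross errors v_t.
% IO: the innovations u_t themselves have heavy tails and feed through the dynamics.
\begin{align*}
\text{AO:}\quad & x_t = \phi\,x_{t-1} + u_t, \qquad y_t = x_t + v_t \\
\text{IO:}\quad & y_t = \phi\,y_{t-1} + u_t, \qquad u_t \ \text{heavy-tailed}
\end{align*}
```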