2020
DOI: 10.1007/s00190-020-01376-6

Self-tuning robust adjustment within multivariate regression time series models with vector-autoregressive random errors

Abstract: The iteratively reweighted least-squares approach to self-tuning robust adjustment of parameters in linear regression models with autoregressive (AR) and t-distributed random errors, previously established in Kargoll et al. (in J Geod 92(3):271–297, 2018. 10.1007/s00190-017-1062-6), is extended to multivariate approaches. Multivariate models are used to describe the behavior of multiple observables measured contemporaneously. The proposed approaches allow for the modeling of both auto- and cross-correlations …
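The abstract summarizes the iteratively reweighted least-squares (IRLS) idea that the paper extends from the univariate case of Kargoll et al. (2018) to multivariate models with vector-autoregressive errors. As orientation only, the following is a minimal univariate sketch of that idea, assuming a simple AR(1) error process driven by scaled t-distributed white noise with a fixed degree of freedom nu; the paper's actual method also self-tunes nu and treats multiple correlated observable series jointly. The function name `irls_ar1_t` and all variables are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption, not the authors' implementation) of IRLS for a
# linear regression model y = A x + e whose errors e follow an AR(1) process
# driven by scaled t-distributed white noise with fixed degree of freedom nu.
import numpy as np

def irls_ar1_t(A, y, nu=4.0, n_iter=50):
    """Return estimates of the regression parameters x, the AR(1) coefficient
    phi and the scale sigma2 of the white-noise component."""
    n = len(y)
    x = np.linalg.lstsq(A, y, rcond=None)[0]      # ordinary LSQ start values
    phi, sigma2 = 0.0, float(np.var(y - A @ x))
    for _ in range(n_iter):
        e = y - A @ x                              # colored (AR) residuals
        u = e[1:] - phi * e[:-1]                   # decorrelated residuals
        # expectation-step-like reweighting under the t-distribution:
        # small weights mark outlying decorrelated residuals
        w = (nu + 1.0) / (nu + u**2 / sigma2)
        # update the AR(1) coefficient by weighted regression of e_t on e_{t-1}
        phi = np.sum(w * e[1:] * e[:-1]) / np.sum(w * e[:-1] ** 2)
        # update the scale of the white-noise process
        u = e[1:] - phi * e[:-1]
        sigma2 = float(np.sum(w * u**2) / (n - 1))
        # reweighted least squares for x on the decorrelated observations
        Ad = A[1:] - phi * A[:-1]
        yd = y[1:] - phi * y[:-1]
        sw = np.sqrt(w)
        x = np.linalg.lstsq(Ad * sw[:, None], yd * sw, rcond=None)[0]
    return x, phi, sigma2
```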

Citations: cited by 8 publications (5 citation statements)
References: 64 publications
“…A second approach to estimate the AT is to model the time series: the AR process as introduced in Box et al (2016) is a prominent example of modelling, see Kargoll et al (2020) and the references inside for applications in geodesy. In that case, a value from a time series is regressed on previous values from that same time series, i.e.…”
Section: The AT and the Autoregressive Process (mentioning, confidence: 99%)
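To make the quoted description of the AR process concrete, here is a small illustrative sketch (function and variable names are assumptions, not taken from the cited papers) in which each value of a time series is regressed, by least squares, on the p previous values of the same series:

```python
# Illustrative sketch: in an AR(p) process, a value of the time series is
# regressed on the p previous values of that same series.
import numpy as np

def fit_ar(series, p):
    """Least-squares estimate of AR(p) coefficients from a 1-D time series."""
    y = series[p:]                                         # current values y_t
    X = np.column_stack([series[p - k: len(series) - k]    # lagged values y_{t-k}
                         for k in range(1, p + 1)])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# usage: simulate an AR(2) process and recover its coefficients
rng = np.random.default_rng(0)
e = rng.standard_normal(5000)
z = np.zeros(5000)
for t in range(2, 5000):
    z[t] = 0.6 * z[t - 1] - 0.3 * z[t - 2] + e[t]
print(fit_ar(z, 2))   # approximately [0.6, -0.3]
```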
“…Other methods use the Markov chain Monte Carlo (Olivares and Teferle 2013) or the expectation-maximization (EM) algorithm (Kargoll et al. 2020).…”
Section: Introduction (mentioning, confidence: 99%)
“…The joint estimation of both deterministic and stochastic models is often based on the maximum likelihood estimator (MLE) and has been implemented in various software packages such as CATS (Williams 2008), Est_noise (Langbein 2008) and Hector (Bos et al 2008). Other methods use the Markov chain Monte Carlo (Olivares and Teferle 2013) or the expectation-maximization (EM) algorithm (Kargoll et al 2020).…”
Section: Introduction (mentioning, confidence: 99%)
“…The joint estimation of both deterministic and stochastic models is often based on the Maximum Likelihood Estimator (MLE) and has been implemented in various software packages such as CATS (Williams, 2008), Est_noise (Langbein, 2008) and Hector (Bos et al, 2008). Other methods use the Markov chain Monte-Carlo (Olivares and Teferle, 2013) or the Expectation-Maximization (EM) algorithm (Kargoll et al, 2020).…”
Section: Introduction (mentioning, confidence: 99%)
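The statements above contrast joint maximum-likelihood estimation of the deterministic and stochastic models with MCMC- and EM-based alternatives. Purely as an illustration of the MLE idea (a toy example, not the CATS, Est_noise or Hector implementations), a linear trend and the parameters of an AR(1) noise model can be estimated jointly by numerically maximizing the conditional Gaussian likelihood of the innovations:

```python
# Toy sketch (assumption): jointly estimate a deterministic linear trend a + b*t
# and an AR(1) stochastic model (phi, sigma) by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(theta, t, y):
    a, b, phi, log_sigma = theta
    r = y - (a + b * t)                    # residuals w.r.t. the deterministic model
    u = r[1:] - phi * r[:-1]               # innovations of the AR(1) noise model
    # conditional Gaussian negative log-likelihood (additive constants dropped)
    return len(u) * log_sigma + 0.5 * np.sum(u**2) / np.exp(2.0 * log_sigma)

# simulate a trend plus AR(1) noise and recover all parameters jointly
rng = np.random.default_rng(1)
t = np.arange(1000.0)
noise = np.zeros(1000)
for k in range(1, 1000):
    noise[k] = 0.8 * noise[k - 1] + 0.5 * rng.standard_normal()
y = 2.0 + 0.01 * t + noise

b0, a0 = np.polyfit(t, y, 1)               # crude trend fit for starting values
res = minimize(neg_log_likelihood, x0=[a0, b0, 0.0, 0.0], args=(t, y),
               method="Nelder-Mead", options={"maxiter": 5000})
a, b, phi, log_sigma = res.x
print(a, b, phi, np.exp(log_sigma))        # should be roughly 2.0, 0.01, 0.8, 0.5
```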