2016
DOI: 10.1016/j.csda.2014.07.006

The exact Gaussian likelihood estimation of time-dependent VARMA models

Abstract: An algorithm for the evaluation of the exact Gaussian likelihood of an r-dimensional vector autoregressive-moving average (VARMA) process of order (p, q), with time-dependent coefficients, including a time-dependent innovation covariance matrix, is proposed. The elements of the matrices of coefficients and those of the innovation covariance matrix are deterministic functions of time and assumed to depend on a finite number of parameters. These parameters are estimated by maximizing the Gaussian likelihood function…
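
The paper's own algorithm is not reproduced in the abstract. As a generic illustration only, the Python sketch below evaluates such a likelihood through the prediction-error decomposition of a Kalman filter, assuming the tdVARMA model has already been cast in a linear Gaussian state-space form with time-varying matrices; the function name `tdvarma_loglik_statespace` and its interface are hypothetical, not the AJM implementation of the paper.

```python
import numpy as np

def tdvarma_loglik_statespace(y, T_mats, Z_mats, Q_mats, a1, P1):
    """Exact Gaussian log-likelihood of a linear Gaussian state-space model
    with time-varying system matrices, via the prediction-error
    decomposition computed by a Kalman filter.

    y      : (n, r) array of observations
    T_mats : length-n list of state-transition matrices T_t
    Z_mats : length-n list of observation matrices Z_t
    Q_mats : length-n list of state-innovation covariances Q_t
    a1, P1 : mean vector and covariance matrix of the initial state
    """
    n, r = y.shape
    a, P = a1.copy(), P1.copy()
    loglik = 0.0
    for t in range(n):
        Tt, Zt, Qt = T_mats[t], Z_mats[t], Q_mats[t]
        # One-step-ahead prediction of the state
        a = Tt @ a
        P = Tt @ P @ Tt.T + Qt
        # Innovation (one-step-ahead forecast error) and its covariance
        v = y[t] - Zt @ a
        F = Zt @ P @ Zt.T
        Finv = np.linalg.inv(F)
        # Contribution of observation t to the Gaussian log-likelihood
        _, logdetF = np.linalg.slogdet(F)
        loglik -= 0.5 * (r * np.log(2.0 * np.pi) + logdetF + v @ Finv @ v)
        # Measurement update (Kalman gain)
        K = P @ Zt.T @ Finv
        a = a + K @ v
        P = P - K @ Zt @ P
    return loglik
```

A time-dependent innovation covariance, as in the abstract, enters simply through the sequence of matrices Q_t.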

Cited by 5 publications (4 citation statements)
References 30 publications
“…Additionally, [16] provided a better foundation for the asymptotic theory for array processes, a theorem for a reduction of the order of moments from 8 to slightly more than 4 and tools for obtaining the asymptotic covariance matrix of the estimator. In [17], there was an example of vector tdAR and tdMA models on monthly log returns of IBM stock and the S&P500 index from January 1926 to December 1999, treated first in [18].…”
Section: Let φ
confidence: 99%
“…The simulation experiment is performed in Matlab by using the program, which we call AJM, described in Alj et al (2015c) and based on a special case of tdVAR(1) process defined in (4.4)-(4.5), with A′_11 = 0.8 and A′_22 = −0.9, and (ε_t1, ε_t2)^T has a bivariate normal distribution with covariance matrix Σ = I_2. A simulated series using these specifications is shown in Fig.…”
Section: Example 1: tdVAR(1), a Generalization of Kwoun and Yajima (1986)
confidence: 99%
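
The excerpt above quotes only the diagonal coefficients A′_11 = 0.8 and A′_22 = −0.9 and innovations with covariance I_2; the actual time-dependent specification (4.4)-(4.5) of Alj et al. (2015c) is not reproduced. The minimal Python sketch below simulates a bivariate VAR(1) with time-dependent coefficients; the linear-in-time modulation inside `A_fun` is an assumption made purely for illustration, not the specification used in the cited experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_tdvar1(n, A_fun, Sigma):
    """Simulate y_t = A_t y_{t-1} + eps_t with a time-dependent coefficient
    matrix A_t = A_fun(t, n) and Gaussian innovations eps_t ~ N(0, Sigma)."""
    r = Sigma.shape[0]
    L = np.linalg.cholesky(Sigma)
    y = np.zeros((n, r))
    y_prev = np.zeros(r)            # zero initial condition for the sketch
    for t in range(n):
        y_prev = A_fun(t, n) @ y_prev + L @ rng.standard_normal(r)
        y[t] = y_prev
    return y

def A_fun(t, n, delta=0.1):
    # Hypothetical time dependence: the diagonal coefficients drift linearly
    # around A'_11 = 0.8 and A'_22 = -0.9 quoted in the excerpt; the form
    # (4.4)-(4.5) of Alj et al. (2015c) is not shown here.
    scale = 1.0 + delta * (t - (n - 1) / 2.0) / n
    return np.diag([0.8, -0.9]) * scale

y = simulate_tdvar1(n=400, A_fun=A_fun, Sigma=np.eye(2))
```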
“…There is no assumption of stationarity but, although it is not illustrated in our examples, there is an adjustment in the asymptotic theory for allowing non-normal observations. One major advantage of QMLE is that the Gaussian likelihood function can be computed exactly, with an efficient algorithm, Alj et al (2015c), and this is very important for short time series. The main task in the Azrak-Mélard approach, hence also in our extension, consists in checking conditions from two crucial theorems in Klimko & Nelson (1978), which respectively ensure existence of an almost surely (a.s.) consistent estimator and prove asymptotic normality of that estimator, whilst providing the asymptotic covariance matrix.…”
Section: Introduction
confidence: 99%
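
The excerpt stresses that QMLE rests on maximizing an exactly computed Gaussian likelihood. As a sketch of that outer optimization step only, the code below passes a user-supplied negative log-likelihood (for instance, built on the state-space evaluator sketched after the abstract) to a standard numerical optimizer; the function name `qmle` and its interface are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def qmle(y, neg_loglik, theta0):
    """Quasi-maximum likelihood estimation: minimize the negative Gaussian
    log-likelihood over the parameter vector theta.

    neg_loglik(theta, y) must return -log L(theta; y), e.g. computed by an
    exact likelihood evaluator for the time-dependent model."""
    res = minimize(neg_loglik, np.asarray(theta0, dtype=float),
                   args=(y,), method="BFGS")
    return res.x, res   # point estimate and full optimizer diagnostics
```

The asymptotic covariance matrix discussed in the excerpt is not, in general, the optimizer's inverse-Hessian approximation: for quasi-maximum likelihood it requires a sandwich-type estimator, which is beyond this sketch.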
“…To cite a few examples of nonlinear multivariate processes, let us mention the self-exciting threshold vector autoregressive [see Tsay (1998)], the smooth transition vector autoregressive (Camacho 2004), the random coefficient VARMA (Alj et al 2014).…”
confidence: 99%