2016
DOI: 10.1016/j.spa.2015.12.003

Locally stationary Hawkes processes

Abstract: This paper addresses the generalisation of stationary Hawkes processes in order to allow for a time-evolving second-order analysis. Motivated by the concept of locally stationary autoregressive processes, we nevertheless apply inherently different techniques to describe the time-varying dynamics of self-exciting point processes. In particular we derive a stationary approximation of the Laplace transform of a locally stationary Hawkes process. This allows us to define a local intensity function and a local Bartlett …
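To make the time-varying dynamics concrete, the following is a minimal Python sketch, under illustrative assumptions, of a self-exciting point process whose baseline intensity mu(u) and excitation strength alpha(u) vary slowly in rescaled time u = t/T, simulated by Ogata-style thinning with an exponential kernel. The functional forms of mu and alpha, the decay rate BETA, and the thinning bound are my own choices for illustration; this is not the paper's construction or its estimators.

```python
import numpy as np

# Illustrative sketch (assumed parameter forms, not the paper's construction):
# a Hawkes-type process whose baseline mu(.) and excitation strength alpha(.)
# vary slowly in rescaled time u = t/T, simulated by Ogata-style thinning.

def mu(u):                       # assumed time-varying baseline, range [0.2, 0.8]
    return 0.5 + 0.3 * np.sin(2 * np.pi * u)

def alpha(u):                    # assumed excitation strength, kept below BETA
    return 0.4 + 0.2 * u

BETA = 1.0                       # decay rate of the exponential kernel

def simulate_ls_hawkes(T, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    events, t = [], 0.0
    while t < T:
        # conditional intensity at the current time
        lam_t = mu(t / T) + sum(alpha(s / T) * np.exp(-BETA * (t - s)) for s in events)
        # Upper bound for thinning: between events the excitation part only
        # decays, and mu(u) can rise by at most 0.6, so lam_t + 1.0 dominates.
        lam_bar = lam_t + 1.0
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam_new = mu(t / T) + sum(alpha(s / T) * np.exp(-BETA * (t - s)) for s in events)
        if rng.uniform() <= lam_new / lam_bar:   # accept candidate with prob lam/lam_bar
            events.append(t)
    return np.array(events)

points = simulate_ls_hawkes(T=200.0)
print(len(points), "events")
```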

Cited by 27 publications (19 citation statements)
References 22 publications
“…Further work is needed here to assess the behaviour of the estimator when correlation structure is present, for instance under self-exciting Hawkes or Cox processes [Hawkes, 1971; Cox, 1955], or even locally stationary processes [Park et al., 2014; Gibberd and Nelson, 2016; Roueff et al., 2016].…”
Section: Discussion (mentioning, confidence: 99%)
“…Plenty of state-of-the-art works have performed inference for a time-changing baseline intensity with a stationary triggering kernel [13,14,27]. For the case where both the baseline intensity and the triggering kernel are nonstationary, the authors of [24] and [23] provided a general nonparametric estimation theory for the first- and second-order cumulants of a locally stationary Hawkes process. However, this method is computationally inefficient because every point of the two-dimensional covariance function Cov(τ, t) has to be estimated, and it is not applicable to real applications.…”
Section: Nonstationary Hawkes Process (mentioning, confidence: 99%)
“…However, this method is computationally inefficient because every point of the two-dimensional covariance function Cov(τ, t) has to be estimated, and it is not applicable to real applications. In this sense, our MRS algorithm can be considered a "coarser" version of the work in [24]: it combines adjacent small sectors with similar statistical properties into a larger segment and only outputs more heterogeneous segments. Although it is "coarser", its computational complexity is drastically reduced, which makes it practical.…”
Section: Nonstationary Hawkes Process (mentioning, confidence: 99%)
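To illustrate the computational point raised in the excerpts above, here is a rough Python sketch, purely my own illustration and not the estimator of [24] or the MRS algorithm, of a windowed, binned second-order estimate: an empirical local autocovariance of the binned counts is computed for every lag τ and every window centre t on a grid, so the cost grows with the product of the two grid sizes.

```python
import numpy as np

# Rough illustration (not the cited estimators): a local, binned autocovariance
# estimate Cov_hat(tau, t) for a point process on a (lag, time) grid.
# The nested loops make the grid-product cost explicit.

def local_autocovariance(events, T, delta=0.5, window=50.0, max_lag=10.0, step=5.0):
    # bin the point process into counts over intervals of width `delta`
    counts, _ = np.histogram(events, bins=np.arange(0.0, T + delta, delta))
    lags = np.arange(0, int(max_lag / delta))            # lags, in bins
    centres = np.arange(window / 2, T - window / 2, step)
    cov = np.zeros((len(lags), len(centres)))
    half = int(window / (2 * delta))                     # half window, in bins
    for j, t in enumerate(centres):                      # loop over local windows
        c = int(t / delta)
        seg = counts[c - half: c + half].astype(float)
        seg -= seg.mean()                                # centre the local counts
        for i, k in enumerate(lags):                     # loop over lags
            cov[i, j] = np.mean(seg * seg) if k == 0 else np.mean(seg[:-k] * seg[k:])
    return lags * delta, centres, cov
```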
“…A similar procedure is developed in [17]. In the case of …”
[Figure 3 caption: Difference of the log-likelihood scaled by T between forward and backward time arrows for a HP with an exponential kernel, for λ0 = {0.001, 0.0025, 0.0050, 0.0075, 0.0100} and α = {0.010, 0.025, 0.050, 0.075, 0.100}, while β is adjusted to match the desired n. The data points are grouped according to their endogeneity and averaged over 100 runs for each parameter permutation.]
Section: A Log-likelihood (mentioning, confidence: 99%)
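For context on the quantity compared in that excerpt, the sketch below gives the standard log-likelihood of a Hawkes process with exponential kernel, λ(t) = λ0 + α Σ_{t_i < t} exp(−β (t − t_i)) on [0, T], computed with the usual O(N) recursion. It is the generic textbook formula rather than code from [17] or the citing paper; the forward/backward comparison described there would evaluate it on both the original and the time-reversed event times.

```python
import numpy as np

# Generic Hawkes log-likelihood with exponential kernel (textbook formula):
#   lambda(t) = lam0 + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
#   logL = sum_i log(lambda(t_i)) - integral_0^T lambda(t) dt

def hawkes_exp_loglik(times, T, lam0, alpha, beta):
    times = np.asarray(times, dtype=float)
    loglik = -lam0 * T                 # baseline part of the compensator
    A = 0.0                            # A(i) = sum_{j<i} exp(-beta * (t_i - t_j))
    prev = None
    for t in times:
        if prev is not None:
            A = np.exp(-beta * (t - prev)) * (A + 1.0)   # O(1) update per event
        loglik += np.log(lam0 + alpha * A)
        # each event adds (alpha/beta) * (1 - exp(-beta*(T - t))) to the integral
        loglik -= (alpha / beta) * (1.0 - np.exp(-beta * (T - t)))
        prev = t
    return loglik

# A forward/backward comparison as described above could evaluate, e.g.,
# hawkes_exp_loglik(times, T, lam0, alpha, beta) and
# hawkes_exp_loglik(np.sort(T - times), T, lam0, alpha, beta).
```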