2017
DOI: 10.1016/j.ejor.2016.06.006

Relevant states and memory in Markov chain bootstrapping and simulation

Abstract: Markov chain theory is proving to be a powerful approach to bootstrap highly nonlinear time series. In this work we provide a method to estimate the memory of a Markov chain (i.e. its order) and to identify its relevant states. In particular, the choice of memory lags and the aggregation of irrelevant states are obtained by looking for regularities in the transition probabilities. Our approach is based on an optimization model. More specifically, we consider two competing objectives that a researcher will in gen…
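
A minimal sketch of the starting point described in the abstract, assuming a series already discretized into a finite set of symbols: the empirical transition probabilities of a candidate memory k are estimated, so that histories with nearly identical conditional rows become candidates for aggregation. The function name and the toy series are hypothetical; this is not the authors' implementation.

```python
# Illustrative sketch only: empirical conditional distributions of order k
# from a discretized series. Histories whose rows look alike are candidates
# for merging into a single "relevant state".
from collections import Counter, defaultdict

def transition_probs(symbols, k):
    """Empirical P(next symbol | last k symbols) estimated from the series."""
    counts = defaultdict(Counter)
    for t in range(k, len(symbols)):
        history = tuple(symbols[t - k:t])
        counts[history][symbols[t]] += 1
    return {h: {s: n / sum(c.values()) for s, n in c.items()}
            for h, c in counts.items()}

# Toy discretized series (three symbols); compare the rows across k = 1, 2, ...
series = [0, 1, 1, 0, 2, 1, 0, 1, 1, 0, 2, 1, 0, 1, 1, 0]
for k in (1, 2):
    print(k, transition_probs(series, k))
```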

Cited by 14 publications (7 citation statements)
References 66 publications
“…In this way the resulting groups emerge as the relevant states, that is, the states which significantly influence the conditional distribution of the process. In this work we also extend theoretically the analysis in Cerqueti et al. (2010, 2012, 2013) by introducing L^p norm based distance measures. We also show that the minimization of the objective function represented by the distance measure of the partitions, which is based on the transition probabilities of the states, corresponds to the minimization of the information loss function in the sense of Kolmogorov (1965).…”
Section: Introduction (mentioning)
confidence: 60%
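
To make the quoted idea concrete, the sketch below groups histories whose transition-probability rows are close in an L^p sense, so that the groups play the role of the "relevant states" mentioned above. The threshold, the greedy grouping rule, and the function names are illustrative assumptions, not the partition model of the paper.

```python
# Illustrative sketch (not the authors' code): histories whose rows of
# transition probabilities are close in an L^p sense are grouped together.
def lp_distance(p, q, order=1):
    """L^p distance between two probability rows of equal length."""
    return sum(abs(a - b) ** order for a, b in zip(p, q)) ** (1.0 / order)

def group_rows(rows, tol=0.05, order=1):
    """Greedy grouping of labelled rows whose pairwise L^p distance is <= tol."""
    groups = []   # each group is a list of (label, row) pairs
    for label, row in rows.items():
        for group in groups:
            if all(lp_distance(row, r, order) <= tol for _, r in group):
                group.append((label, row))
                break
        else:
            groups.append([(label, row)])
    return [[label for label, _ in g] for g in groups]

rows = {"u": [0.50, 0.30, 0.20], "v": [0.49, 0.31, 0.20], "w": [0.10, 0.10, 0.80]}
print(group_rows(rows))   # [['u', 'v'], ['w']] under the chosen tolerance
```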
“…Constraint (9) forces at least one variable t_γ to be equal to 1, thus guaranteeing that the diameter of the optimal partition is at least γ. Cerqueti et al. (2010) discuss the coherence of some distance indicators with information-type criteria advanced by Kolmogorov (1965). According to Remark 1, the dissimilarity measure used in the objective function of model (12) respects the conditions of Kolmogorov (1965).…”
Section: An Optimization Approach to the Aggregation of the Rows of A (mentioning)
confidence: 99%
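
The constraint quoted above acts on the diameter of a partition, i.e. the largest dissimilarity between two states assigned to the same class. The snippet below only illustrates that quantity under assumed names and toy numbers; it does not reproduce constraint (9) or model (12).

```python
# Illustrative only: the "diameter" of a partition is the largest dissimilarity
# d_uv over pairs of states placed in the same class. In the cited model the
# binary variables t_gamma guarantee that this diameter is at least gamma;
# here we merely compute the diameter and check the condition.
def partition_diameter(partition, dissimilarity):
    """Max d_uv over pairs (u, v) placed in the same class of the partition."""
    diam = 0.0
    for cls in partition:
        for i, u in enumerate(cls):
            for v in cls[i + 1:]:
                diam = max(diam, dissimilarity[(u, v)])
    return diam

d = {("a", "b"): 0.1, ("a", "c"): 1.4, ("b", "c"): 1.3}   # toy dissimilarities
partition = [["a", "b"], ["c"]]                           # candidate partition
gamma = 0.05
print(partition_diameter(partition, d) >= gamma)          # True
```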
“…It is, in fact, a distance indicator between the rows of matrix M which takes values in the interval [0, 2] (see Cerqueti et al. 2010). As we will argue below, the dissimilarity d_uv can be viewed as a proxy for the "cost" of putting k-states α_u(k) and α_v(k) in the same class of a partition, and, indeed, the objective function of our optimization model is based on it.…”
Section: Notation and Definitions (mentioning)
confidence: 99%
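
The [0, 2] range quoted above is what one obtains, for instance, from an L1-type dissimilarity between two probability rows: identical rows give 0 and rows with disjoint support give 2. The sketch below uses such an L1 stand-in for d_uv purely for illustration; the paper's exact definition may differ.

```python
# Hedged illustration of the [0, 2] range: an L1 dissimilarity between two
# probability rows, read as the "cost" of merging the corresponding k-states.
def d_uv(row_u, row_v):
    """L1 dissimilarity between two probability rows; lies in [0, 2]."""
    return sum(abs(a - b) for a, b in zip(row_u, row_v))

print(d_uv([0.5, 0.5, 0.0], [0.5, 0.5, 0.0]))  # 0.0 -> merging is "free"
print(d_uv([1.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # 2.0 -> maximal merge cost
```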
“…Proposed initially by Efron (1979), bootstrapping is a technique that consists of resampling given observations in order to obtain a good estimate of the statistical properties of the original population. The bootstrapping approach makes it possible to preserve the 'structural' similarity between the original and resampled data sets, while also ensuring a controlled diversification of the same (Cerqueti et al., 2017). For details of the procedure and related methodological contributions, the reader is referred to the studies by Freedman (1984), Freedman and Peters (1984), and Efron and Tibshirani (1986, 1993).…”
Section: Data Analysis: A Bootstrapping Approach (mentioning)
confidence: 99%
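
For readers unfamiliar with the Efron (1979) idea referenced above, the sketch below shows the classical i.i.d. bootstrap (not the Markov chain bootstrap of the paper under discussion): the observed sample is resampled with replacement to approximate the sampling distribution of a statistic. Names and data are illustrative assumptions.

```python
# Minimal i.i.d. bootstrap sketch in the sense of Efron (1979): resample the
# observed data with replacement and recompute the statistic (here, the mean)
# to approximate its sampling distribution. Illustrative only.
import random
import statistics

def bootstrap_means(data, n_resamples=1000, seed=0):
    rng = random.Random(seed)
    return [statistics.mean(rng.choices(data, k=len(data)))
            for _ in range(n_resamples)]

data = [2.1, 1.9, 3.4, 2.8, 2.2, 3.0]
means = bootstrap_means(data)
print(statistics.mean(means), statistics.stdev(means))  # point estimate and bootstrap s.e.
```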