“…Even in this case, sophisticated MCMC schemes need to be developed to perform Bayesian inference (Frühwirth-Schnatter and Sögner, 2008; Roberts et al., 2004). However, it is argued in Gander and Stephens (2007) that 'the use of the gamma marginal model appears to be motivated by computational tractability, rather than by any theoretical or empirical reasoning'.…”
Summary. Markov chain Monte Carlo and sequential Monte Carlo methods have emerged as the two main tools to sample from high dimensional probability distributions. Although asymptotic convergence of Markov chain Monte Carlo algorithms is ensured under weak assumptions, the performance of these algorithms is unreliable when the proposal distributions that are used to explore the space are poorly chosen and/or if highly correlated variables are updated independently. We show here how it is possible to build efficient high dimensional proposal distributions by using sequential Monte Carlo methods. This allows us not only to improve over standard Markov chain Monte Carlo schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so. We demonstrate these algorithms on a non-linear state space model and a Lévy-driven stochastic volatility model.
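The construction described in this summary — using a sequential Monte Carlo (particle filter) likelihood estimate inside a Metropolis–Hastings chain, i.e. particle marginal Metropolis–Hastings — can be sketched on a standard benchmark non-linear state space model. Everything below (the model parameters, the flat prior on log σ_w, the particle and iteration counts) is an illustrative assumption, not the paper's own implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(T, sv=np.sqrt(10.0), sw=1.0):
    """Simulate the benchmark non-linear state space model
    x_t = x_{t-1}/2 + 25 x_{t-1}/(1+x_{t-1}^2) + 8 cos(1.2 t) + v_t,
    y_t = x_t^2 / 20 + w_t."""
    x = np.zeros(T); y = np.zeros(T)
    x[0] = rng.normal(0.0, np.sqrt(5.0))
    for t in range(T):
        if t > 0:
            x[t] = (0.5 * x[t-1] + 25 * x[t-1] / (1 + x[t-1]**2)
                    + 8 * np.cos(1.2 * t) + rng.normal(0.0, sv))
        y[t] = x[t]**2 / 20 + rng.normal(0.0, sw)
    return x, y

def pf_loglik(y, sv, sw, N=200):
    """Bootstrap particle filter estimate of the marginal log-likelihood."""
    x = rng.normal(0.0, np.sqrt(5.0), N)
    ll = 0.0
    for t in range(len(y)):
        if t > 0:
            x = (0.5 * x + 25 * x / (1 + x**2)
                 + 8 * np.cos(1.2 * t) + rng.normal(0.0, sv, N))
        # Gaussian observation log-weights, stabilised by the max trick
        logw = -0.5 * np.log(2 * np.pi * sw**2) - (y[t] - x**2 / 20)**2 / (2 * sw**2)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())           # log-likelihood increment
        x = rng.choice(x, N, p=w / w.sum())  # multinomial resampling
    return ll

def pmmh(y, n_iter=200, step=0.2):
    """Random-walk MH on log(sw), with the exact likelihood replaced by
    the particle estimate; a flat prior on log(sw) is assumed here."""
    sw = 1.0  # illustrative starting value
    ll = pf_loglik(y, np.sqrt(10.0), sw)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        sw_prop = sw * np.exp(step * rng.normal())  # symmetric on the log scale
        ll_prop = pf_loglik(y, np.sqrt(10.0), sw_prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            sw, ll = sw_prop, ll_prop
        chain[i] = sw
    return chain
```

Because the particle estimate of the likelihood is unbiased, this chain targets the exact posterior despite the likelihood being approximated at every step; that is the key property the summary alludes to.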
“…Bollerslev and Zhou (2002), and simulation methods, see e.g. Roberts et al. (2004), Frühwirth-Schnatter and Sögner (2001), Griffin and Steel (2006), and Aït-Sahalia and Kimmel (2007) and the references therein.…”
Section: Non-parametric Estimation of the Leverage Effect
This paper proposes the new concept of stochastic leverage in stochastic volatility models. Stochastic leverage refers to a stochastic process which replaces the classical constant correlation parameter between the asset return and the stochastic volatility process. We provide a systematic treatment of stochastic leverage and propose to model the stochastic leverage effect explicitly, e.g. by means of a linear transformation of a Jacobi process. Such models are both analytically tractable and allow for a direct economic interpretation. In particular, we propose two new stochastic volatility models which allow for a stochastic leverage effect: the generalised Heston model and the generalised Barndorff-Nielsen & Shephard model. We investigate the impact of a stochastic leverage effect in the risk neutral world by focusing on implied volatilities generated by option prices derived from our new models. Furthermore, we give a detailed account on statistical properties of the new models.
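The modelling idea in this abstract — replacing the constant leverage correlation by a linear transformation of a Jacobi process — can be sketched with a simple Euler discretisation. The Jacobi dynamics dρ_t = κ(θ − ρ_t)dt + σ√(ρ_t(1 − ρ_t))dW_t keep the process in (0, 1), and an affine map sends it onto (−1, 1). All parameter values and the truncation step below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def simulate_stochastic_leverage(T=1000, dt=0.01, kappa=2.0, theta=0.5,
                                 sigma=0.5, seed=1):
    """Euler scheme for a Jacobi process on (0, 1), mapped linearly to
    (-1, 1) so it can serve as a time-varying leverage correlation.
    Parameters are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    x = np.empty(T)
    x[0] = theta
    sqdt = np.sqrt(dt)
    for t in range(1, T):
        drift = kappa * (theta - x[t-1])                      # mean reversion
        diff = sigma * np.sqrt(max(x[t-1] * (1.0 - x[t-1]), 0.0))
        x[t] = x[t-1] + drift * dt + diff * sqdt * rng.normal()
        x[t] = min(max(x[t], 1e-6), 1.0 - 1e-6)  # keep the Euler step inside (0,1)
    return 2.0 * x - 1.0  # affine map onto the correlation range (-1, 1)
```

The square-root diffusion coefficient vanishes at the boundaries, which is what keeps the exact process inside (0, 1); the clipping line only guards against the discretisation error of the Euler step.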
“…The autocorrelation function of σ²(t) is given by Corr(σ²(t), σ²(t + s)) = exp{−λs}, which does not depend on the Lévy density of z (and hence not on the marginal distribution of σ²(t)) but does depend on λ. This form is not suitable for asset returns or stock indices (Barndorff-Nielsen and Shephard 2001), since their autocorrelations tend to show a rapid initial decay followed by a slower decay at longer lags, as confirmed by many subsequent applications (Gander and Stephens 2007a,b; Frühwirth-Schnatter and Sögner 2009). A more flexible dependence structure can be created using superpositions of OU processes, which were first studied by Barndorff-Nielsen (2001).…”
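The point of the superposition can be made concrete: an independent superposition of OU processes has an autocorrelation that is a weighted mixture of exponentials, Σ_j w_j exp(−λ_j s), which can decay fast at short lags while retaining a slow tail — exactly the shape a single exp(−λs) cannot produce. The lag values and the two-component mixture below are illustrative.

```python
import numpy as np

def acf_ou(s, lam):
    """ACF of a single OU-type volatility process: exp(-lam * s)."""
    return np.exp(-lam * np.asarray(s, dtype=float))

def acf_superposition(s, lams, weights):
    """ACF of an independent superposition of OU processes:
    a convex combination of exponential decays."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    s = np.asarray(s, dtype=float)
    return sum(wj * np.exp(-lj * s) for wj, lj in zip(w, lams))

# A fast component (lam=5) plus a slow one (lam=0.1): rapid initial
# decay, then a slower tail than a single OU with lam=1.
lags = np.array([0.0, 0.5, 5.0, 20.0])
single = acf_ou(lags, 1.0)
mixture = acf_superposition(lags, lams=[5.0, 0.1], weights=[0.7, 0.3])
```

At lag 0.5 the mixture has already fallen below the single-λ curve, while at lag 20 it sits well above it, reproducing the fast-then-slow decay pattern described above.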
Section: Lemma 1 (BNS) Let Z Be a Lévy Process With Positive Increments
This paper describes a Bayesian nonparametric approach to volatility estimation. Volatility is assumed to follow a superposition of an infinite number of Ornstein-Uhlenbeck processes driven by a compound Poisson process with a parametric or nonparametric jump size distribution. This model allows a wide range of possible dependencies and marginal distributions for volatility. The properties of the model and prior specification are discussed, and a Markov chain Monte Carlo algorithm for inference is described. The model is fitted to daily returns of four indices: the Standard and Poors 500, the NASDAQ 100, the FTSE 100, and the Nikkei 225.