2010
DOI: 10.1016/j.amc.2010.03.138
On solving integral equations using Markov chain Monte Carlo methods

Abstract: We study regular expressions that use variables, or parameters, which are interpreted as alphabet letters. We consider two classes of languages denoted by such expressions: under the possibility semantics, a word belongs to the language if it is denoted by some regular expression obtained by replacing variables with letters; under the certainty semantics, the word must be denoted by every such expression. Such languages are regular, and we show that they naturally arise in several applications…

Cited by 39 publications (26 citation statements). References 14 publications.
“…One can now consider constructing importance-sampling-based solutions to this sequence of expectations, as detailed in [36, Algorithm 1, p. 9; Algorithm 2, p. 12] and [74, Algorithm 2.1.1], for which we provide the relevant pseudocode in Algorithm 2. This is summarized as a path-space Sequential Importance Sampling (SIS) approximation to the annual loss distribution, for a Markov chain with initial distribution/density µ(x) > 0 on E and transition kernel M(x, y) > 0 whenever k(x, y) ≠ 0, where M has an absorbing state d ∉ E with M(x, d) = P_d for every x ∈ E, by the steps in Algorithm 2.…”

Section: Stochastic Particle Integration Methods as Solutions to Panj…
Citation type: mentioning; confidence: 99%
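The path-space idea quoted above can be sketched on a toy Fredholm equation f(x) = g(x) + ∫₀¹ k(x, y) f(y) dy. The uniform proposal, the constant absorption probability `p_d`, and the particular g and k below are illustrative assumptions, not the kernel M or model of the cited works; the chain accumulates importance weights k/M until absorption, then applies the terminal correction g(x_n)/P_d.

```python
import random

def sis_fredholm(x0, g, k, p_d=0.5, n_paths=200_000, seed=1):
    """Path-space Monte Carlo estimate of f(x0) for the Fredholm
    equation f(x) = g(x) + \\int_0^1 k(x, y) f(y) dy.

    Simplified stand-in for the quoted scheme: the proposal moves to
    Uniform(0, 1) with density (1 - p_d) and absorbs into the cemetery
    state d with constant probability p_d."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, weight = x0, 1.0
        # Simulate until absorption into d.
        while rng.random() >= p_d:
            y = rng.random()                  # continuation density (1 - p_d) * 1
            weight *= k(x, y) / (1.0 - p_d)   # importance weight k / M
            x = y
        total += weight * g(x) / p_d          # terminal correction g(x_n) / P_d
    return total / n_paths

# Example: k(x, y) = 1/2, g(x) = x, so f(x) = x + 1/2 in closed form.
est = sis_fredholm(0.3, g=lambda x: x, k=lambda x, y: 0.5)
```

With these choices the exact solution at x0 = 0.3 is 0.8, so the estimate should land close to that value; the estimator is unbiased because each path's product of weights times the terminal term reproduces one term of the Neumann series.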
“…The framework proposed in [36] and [74] develops a recursive numerical solution to the estimation of such risk measures through estimation of the density of the compound process. In particular, we briefly summarize an approach that transforms the standard actuarial solution, known as the Panjer recursion [75], into a sequence of expectations.…”

Section: Recursions for Loss Distributions: Panjer and Beyond
Citation type: mentioning; confidence: 99%
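The Panjer recursion mentioned here can be sketched for the compound Poisson case: with N ~ Poisson(λ) and integer severity pmf f, the aggregate-loss pmf satisfies g₀ = e^{−λ(1−f₀)} and gₙ = (λ/n) Σ_{j=1}^{n} j fⱼ gₙ₋ⱼ. The parameter values below are illustrative, not taken from the cited works.

```python
import math

def panjer_poisson(lam, sev, n_max):
    """Panjer recursion for the pmf of a compound Poisson sum
    S = X_1 + ... + X_N, with N ~ Poisson(lam) and integer severity
    pmf `sev` (sev[j] = P(X = j))."""
    g = [0.0] * (n_max + 1)
    g[0] = math.exp(-lam * (1.0 - sev.get(0, 0.0)))
    for n in range(1, n_max + 1):
        g[n] = (lam / n) * sum(j * sev.get(j, 0.0) * g[n - j]
                               for j in range(1, n + 1))
    return g

# Toy example: frequency Poisson(2), severities 1 or 2 with equal chance.
g = panjer_poisson(2.0, {1: 0.5, 2: 0.5}, 80)
```

The recursion runs in O(n_max²) time and avoids the numerical convolution of severity distributions; truncating at n_max = 80 loses only negligible tail mass for this example, so the computed pmf sums to essentially 1.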
“…Markov jump systems (MJSs) are a special class of hybrid systems with two components, the mode and the state, and may be employed to model the above system phenomena. In MJSs, the dynamics of the jump modes and the continuous states are modeled by finite-state Markov chains [9,28] and differential equations, respectively. Since the celebrated work of Krasovskii and Lidskii on quadratic control [18] in the early 1960s, MJSs have regained increasing interest, and there has been dramatic progress in MJS control theory.…”

Section: Introduction
Citation type: mentioning; confidence: 99%
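A minimal sketch of the mode/state structure described above, assuming scalar linear dynamics and an illustrative two-mode chain (neither taken from the cited paper): the mode evolves as a finite-state Markov chain, and the state is driven by the mode-dependent dynamics.

```python
import random

def simulate_mjs(A, P, x0, steps, seed=0):
    """Simulate a discrete-time Markov jump linear system: the mode
    r_k follows a finite-state Markov chain with transition matrix P,
    and the state follows x_{k+1} = A[r_k] * x_k (scalar dynamics for
    brevity; an illustration of the mode/state split, not the paper's
    continuous-time model)."""
    rng = random.Random(seed)
    mode, x, traj = 0, x0, [x0]
    for _ in range(steps):
        x = A[mode] * x          # state update under the current mode
        traj.append(x)
        u, cum = rng.random(), 0.0
        for j, pj in enumerate(P[mode]):   # sample the next mode
            cum += pj
            if u < cum:
                mode = j
                break
    return traj

# Two modes: contracting (gain 0.5) and expanding (gain 1.2); the chain
# spends most of its time in the stable mode.
traj = simulate_mjs([0.5, 1.2], [[0.9, 0.1], [0.8, 0.2]], 1.0, 50)
```

Even though one mode is unstable on its own, stability of the switched system depends on how long the chain dwells in each mode, which is exactly what MJS control theory analyzes.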
“…Notably, with the exception of (Doucet and Tadic, 2004; Hoffman et al., 2007a), the proposed algorithms have all been variants of the expectation-maximization (EM) procedure; see for a description of the EM approach. In the E step, a belief propagation algorithm is used to estimate the marginal distributions of the latent variables.…”

Section: Introduction
Citation type: mentioning; confidence: 99%
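To make the E-step/M-step structure concrete, here is a toy EM iteration for a two-component Gaussian mixture with unit variances and equal weights. This is an illustrative stand-in only: the cited works use belief propagation to obtain the latent-variable marginals, whereas here the E step computes them in closed form.

```python
import math

def em_mixture(data, mu, iters=50):
    """Minimal EM for a two-component Gaussian mixture with unit
    variances and equal weights. E step: posterior (marginal)
    probability of each latent assignment; M step: weighted mean
    re-estimation."""
    for _ in range(iters):
        # E step: responsibilities r_i = P(z_i = 1 | x_i, mu)
        resp = []
        for x in data:
            w0 = math.exp(-0.5 * (x - mu[0]) ** 2)
            w1 = math.exp(-0.5 * (x - mu[1]) ** 2)
            resp.append(w1 / (w0 + w1))
        # M step: update each mean as a responsibility-weighted average
        s1 = sum(resp)
        mu = (sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - s1),
              sum(r * x for r, x in zip(resp, data)) / s1)
    return mu

# Two well-separated clusters around -2 and +2 (synthetic toy data).
data = [-2.1, -1.9, -2.0, 1.9, 2.1, 2.0]
mu = em_mixture(data, (-1.0, 1.0))
```

On this data the means converge close to the cluster centers (−2, 2); in the latent-variable models discussed above, the same role of "compute posterior marginals, then maximize" is played by belief propagation followed by parameter updates.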