2016
DOI: 10.1002/9781118740712
Introduction to Stochastic Processes With R

Cited by 49 publications (54 citation statements). References 0 publications.
“…Each of the sample points x1,…,xd is allowed to take a discrete or continuous time value that denotes the project ages already known from the given repository (D). The Markov chain is then executed a number of times until the chain converges to its limiting distribution to obtain the target sample subset [6]. The selected sample, with its respective t states (or ages) and constructed limiting distribution, forms the Bellwether, which can be used as the moving window.…”
Section: B. Markov Chain Monte Carlo (mentioning)
confidence: 99%
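The chain-convergence step described in the excerpt above — running a Markov chain until it reaches its limiting distribution — can be sketched with a small example. The 3-state transition matrix below is illustrative only and is not taken from the cited work.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1); illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Iterate the state distribution, pi_{k+1} = pi_k @ P, until it stops changing.
pi = np.array([1.0, 0.0, 0.0])  # start with all mass in state 0
for _ in range(1000):
    nxt = pi @ P
    if np.allclose(nxt, pi, atol=1e-12):
        break
    pi = nxt

# pi now approximates the limiting distribution, i.e. pi @ P == pi.
```

The stopping rule here is a simple fixed-point check on the distribution; in practice MCMC convergence is assessed with diagnostics rather than an exact tolerance.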
“…Assume there exists a variable P that can be formulated as a matrix of transition probabilities of a Markov chain [6] from the sample space {X^(1),…,X^(t)}, where t_i denotes the transition states (project ages); then the ij-th element p^(k)_ij of P is the probability that the Markov chain, starting from a particular state t_i, will transition to t_j after k steps. If p^(k)_ij is homogeneous, then (2) holds and there exists a unique probability matrix u that can form the ergodic Markov chain (EMC), such that for large values of k, (3) can be defined as follows:…”
Section: If P Denotes a Regular Transition Probability Matrix (TPM) (mentioning)
confidence: 99%
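The property described in this excerpt — that for a regular chain the k-step transition matrix P^k converges to a matrix whose rows are all equal to a unique stationary distribution u — can be sketched as follows. The matrix values are illustrative assumptions, not data from the cited work.

```python
import numpy as np

# Hypothetical regular transition probability matrix (rows sum to 1); illustrative only.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.4, 0.4]])

# For a regular chain, P^k converges as k grows to a matrix whose rows
# are all (approximately) the unique stationary distribution u.
Pk = np.linalg.matrix_power(P, 50)
u = Pk[0]  # any row of P^k approximates u

# u is invariant under the chain: u @ P == u (up to numerical tolerance).
```

Raising P to a moderate power is enough here because the subdominant eigenvalues of a regular stochastic matrix have modulus strictly less than 1, so their contribution decays geometrically.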