1979
DOI: 10.1111/j.2517-6161.1979.tb01052.x
Conditional Independence in Statistical Theory

Abstract: Some simple heuristic properties of conditional independence are shown to form a conceptual framework for much of the theory of statistical inference. This framework is illustrated by an examination of the rôle of conditional independence in several diverse areas of the field of statistics. Topics covered include sufficiency and ancillarity, parameter identification, causal inference, prediction sufficiency, data selection mechanisms, invariant statistical models and a subjectivist approach to model‐bu…

Cited by 1,039 publications (712 citation statements)
References 29 publications
“…Now, we can say that a MPP Y(t, m) is independent of a random variable Z if its probability distribution is not a function of Z. Therefore, given that the probability distribution of a MPP can be written entirely in terms of its mark-specific hazard functions, using Dawid's notation (see Dawid 1979), we can say that…”
Section: Some Preliminaries (mentioning)
Confidence: 98%
“…I shall often mix the two notations; more precisely: K, A ⊥ L, B / M, C is to mean that, for all K-states D, L-states E, and M-states F, A ∩ D ⊥ B ∩ E / C ∩ F. For proofs see, e.g., Dawid (1979) or Spohn (1980). In particular Theorem 2(e) will be important; this is a further reason for assuming a strictly positive probability measure.…”
Section: The Conceptual and Formal Framework (mentioning)
Confidence: 99%
“…When X, Y and W are discrete variables the previous condition is equivalent to Pr(X = x, Y = y | W = w) = Pr(X = x | W = w) Pr(Y = y | W = w), for every w such that Pr(W = w) > 0. In the later proofs we will apply the contraction property, asserting that X ⊥⊥ Z | W and X ⊥⊥ Y | ZW, jointly considered, are equivalent to X ⊥⊥ YZ | W, together with other basic properties of conditional independence (Dawid 1979; Lauritzen 1996). In discussing the lumpability definitions in Sects.…”
Section: Introduction (mentioning)
Confidence: 98%
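The contraction property quoted in the excerpt above admits a short factorization argument for strictly positive discrete distributions; a sketch (one direction only, using standard chain-rule notation not taken from the excerpt):

```latex
% Contraction: X \indep Z \mid W and X \indep Y \mid (Z, W)
% together imply X \indep (Y, Z) \mid W.
\begin{align*}
p(x, y, z \mid w)
  &= p(x, z \mid w)\, p(y \mid x, z, w) \\
  &= p(x \mid w)\, p(z \mid w)\, p(y \mid z, w)
     && \text{by the two premises} \\
  &= p(x \mid w)\, p(y, z \mid w),
\end{align*}
```

which is exactly the factorization defining X ⊥⊥ YZ | W; the converse direction follows by marginalizing and conditioning the joint factorization.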
“…Throughout this paper we will use the notation of conditional independence (Dawid 1979); that is, we will write X ⊥⊥ Y | W when the random variables X and Y are independent once the value of a third variable W is given. When X, Y and W are discrete variables the previous condition is equivalent to Pr(X = x, Y = y | W = w) = Pr(X = x | W = w) Pr(Y = y | W = w), for every w such that Pr(W = w) > 0.…”
Section: Introduction (mentioning)
Confidence: 99%
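The discrete factorization condition in the excerpt above is easy to verify numerically. A minimal sketch in Python, where the toy joint distribution and the helper `cond_indep` are hypothetical illustrations, not taken from any of the cited works:

```python
from itertools import product

# Hypothetical joint distribution p(x, y, w) over three binary variables,
# constructed as p(w) * p(x|w) * p(y|w), so X and Y are conditionally
# independent given W by construction.
p_w = {0: 0.4, 1: 0.6}
p_x_given_w = {0: {0: 0.2, 1: 0.8}, 1: {0: 0.7, 1: 0.3}}
p_y_given_w = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.1, 1: 0.9}}

joint = {(x, y, w): p_w[w] * p_x_given_w[w][x] * p_y_given_w[w][y]
         for x, y, w in product([0, 1], repeat=3)}

def cond_indep(joint, tol=1e-12):
    """Check Pr(X=x, Y=y | W=w) == Pr(X=x | W=w) * Pr(Y=y | W=w)
    for every w with Pr(W=w) > 0."""
    for w in (0, 1):
        pw = sum(v for (x, y, ww), v in joint.items() if ww == w)
        if pw == 0:
            continue  # condition only required where Pr(W=w) > 0
        for x, y in product([0, 1], repeat=2):
            pxy = joint[(x, y, w)] / pw
            px = sum(joint[(x, yy, w)] for yy in (0, 1)) / pw
            py = sum(joint[(xx, y, w)] for xx in (0, 1)) / pw
            if abs(pxy - px * py) > tol:
                return False
    return True

print(cond_indep(joint))  # prints True
```

The same check with a joint distribution that does not factor this way (for instance, one where p(y | x, w) depends on x) would return False, which is what the lumpability arguments in the citing paper rely on.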