2013
DOI: 10.1214/13-aos1145
Quantifying causal influences

Abstract: Many methods for causal inference generate directed acyclic graphs (DAGs) that formalize causal relations between $n$ variables. Given the joint distribution on all these variables, the DAG contains all information about how intervening on one variable changes the distribution of the other $n-1$ variables. However, quantifying the causal influence of one variable on another one remains a nontrivial question. Here we propose a set of natural, intuitive postulates that a measure of causal strength should satisfy…


Cited by 191 publications (261 citation statements) · References 18 publications
“…Despite being interesting and useful metrics, these either fail to accurately measure causation, or are heuristics [8]. One difficulty is that information-theoretic metrics, such as the mutual information, are usually thought to concern statistical correlations and are calculated over some observed distribution.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, recently, the causal strength C_{X→Y}(X;Y|Z) (see SI Appendix, Property S9) for quantifying causal strength was proposed by Janzing et al. (18). The causal strength (CS) can be used to quantify causal influences among variables in a network.…”
Section: Discussion (mentioning)
confidence: 99%
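To make the quantity quoted above concrete: one of the postulates in Janzing et al. is that for the simplest DAG consisting of a single arrow X → Y with no confounding, causal strength should coincide with the mutual information I(X;Y). A minimal sketch for discrete variables follows; the joint table is an invented example (a binary symmetric channel with flip probability 0.1), not data from the paper:

```python
import numpy as np

def mutual_info(p_xy):
    """I(X;Y) in nats, computed from a discrete joint table p_xy[x, y]."""
    p_xy = np.asarray(p_xy, dtype=float)
    px = p_xy.sum(axis=1, keepdims=True)  # marginal p(x)
    py = p_xy.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = p_xy > 0                         # skip zero cells to avoid log(0)
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (px * py)[nz])))

# Invented joint distribution for the single-arrow DAG X -> Y
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
print(mutual_info(p))  # positive: X carries information about Y
```

In this two-node case the value doubles as the causal strength of the arrow; for larger DAGs the measure involves "cutting" the edge, which this sketch does not implement.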
“…In other words, strong dependency between X and Z (or between Y and Z) makes the conditional dependence of X and Y almost invisible when measuring CMI(X;Y|Z) only (18). However, PMI(X;Y|Z) can measure correctly for this case because the partial independence makes the conditional dependence of X and Y visible again by replacing p(x|z) p(y|z) with p*(x|z) p*(y|z), where p*(x|z) [or p*(y|z)] implicitly includes the association information between X and Y, different from p(x|z) [or p(y|z)].…”
Section: Discussion (mentioning)
confidence: 99%
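The quantity CMI(X;Y|Z) discussed in this statement is straightforward to evaluate for discrete variables, which makes the masking effect easy to probe numerically. Below is a sketch (the joint distributions are invented for illustration; PMI itself is not implemented here):

```python
import numpy as np

def cmi(p_xyz):
    """CMI(X;Y|Z) in nats from a discrete joint table p_xyz[x, y, z]."""
    p = np.asarray(p_xyz, dtype=float)
    p_xz = p.sum(axis=1, keepdims=True)      # p(x,z)
    p_yz = p.sum(axis=0, keepdims=True)      # p(y,z)
    p_z = p.sum(axis=(0, 1), keepdims=True)  # p(z)
    nz = p > 0                               # skip zero cells to avoid log(0)
    return float(np.sum(p[nz] * np.log((p_z * p)[nz] / (p_xz * p_yz)[nz])))

# Conditionally independent case: p(x,y,z) = p(z) p(x|z) p(y|z)  ->  CMI = 0
pz = np.array([0.5, 0.5])
px_z = np.array([[0.9, 0.1], [0.2, 0.8]])   # rows indexed by z
py_z = np.array([[0.7, 0.3], [0.4, 0.6]])
p_ci = np.einsum('z,zx,zy->xyz', pz, px_z, py_z)
print(cmi(p_ci))   # ~0

# Strong conditional dependence: Y = X XOR Z with X, Z uniform  ->  CMI = log 2
p_dep = np.zeros((2, 2, 2))
for x in range(2):
    for z in range(2):
        p_dep[x, x ^ z, z] = 0.25
print(cmi(p_dep))  # ~log 2 ≈ 0.693
```

With such a helper one can construct joints where Z nearly determines X and observe how small CMI(X;Y|Z) becomes, which is the masking scenario the quoted passage describes.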
“…In order to overcome these limitations of the (conditional) mutual information, in a series of papers [35][36][37][38][39] we have proposed the use of information theory in combination with Pearl's theory of causation [40]. Our approach has been discussed in [41], where a variant of our notion of node exclusion, introduced in [36], has been utilized for an alternative definition. This definition, however, is restricted to direct causal effects and does not capture, in contrast to [35], mediated causal effects.…”
Section: Preface: Information Integration and Complexity (mentioning)
confidence: 99%