A model-free characterization of causality (2006)
DOI: 10.1016/j.econlet.2005.12.016

Cited by 11 publications (7 citation statements)
References 13 publications
“…Mutual information is non-negative and I(X, X) = H(X). It is also worth noting that such an approach is inherently model-free [39]. Additionally, partial mutual information I(X, Y | Z) denotes the part of mutual information I(X, Y) that is not in Z and is defined as [58]:…”
Section: Methods
confidence: 99%
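The identities quoted above (non-negativity, I(X, X) = H(X), and the conditional/partial mutual information) can be checked numerically. A minimal plug-in sketch for discrete samples, assuming the standard entropy decompositions (function names are mine, not code from the cited works):

```python
# Sketch (not from the cited paper): plug-in estimates of entropy,
# mutual information, and conditional mutual information for
# discrete samples, using only the standard library.
from collections import Counter
from math import log2

def entropy(xs):
    """Plug-in Shannon entropy H(X) in bits."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def conditional_mutual_information(xs, ys, zs):
    """I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)."""
    return (entropy(list(zip(xs, zs))) + entropy(list(zip(ys, zs)))
            - entropy(list(zip(xs, ys, zs))) - entropy(zs))

# Sanity checks on a small sample: I(X, X) = H(X), and MI is non-negative.
x = [0, 0, 1, 1, 2, 2, 0, 1]
y = [0, 1, 0, 1, 0, 1, 0, 1]
assert abs(mutual_information(x, x) - entropy(x)) < 1e-12
assert mutual_information(x, y) >= -1e-12
```

With a constant conditioning variable Z, I(X; Y | Z) reduces to I(X; Y), which is a quick consistency check on the decomposition.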
“…This is the most widely used approach, even though it goes against the assumed complexity of those systems [23,24] and the solid evidence of the nonlinearity of financial markets with regard to stock returns [25-29], market index returns [30-34] and currency exchange rate changes [25,35-38]. This has recently been addressed by exchanging Pearson's correlation coefficient for a more general (model-free [39]) measure, mutual information, which allows a study to account for nonlinearity without relying on the assumption of multivariate normality [40,41].…”
Section: Introduction
confidence: 99%
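The motivation for swapping Pearson's correlation coefficient for mutual information can be illustrated on a toy quadratic dependence, where the linear coefficient vanishes but mutual information does not (a hedged sketch with illustrative data; helper names are my own):

```python
# Sketch: Pearson correlation misses a purely nonlinear (quadratic)
# dependence that mutual information detects.
from collections import Counter
from math import log2, sqrt

def entropy(xs):
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sqrt(sum((a - mx) ** 2 for a in xs))
    sy = sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

x = [-2, -1, 0, 1, 2] * 10           # symmetric around zero
y = [v * v for v in x]               # deterministic nonlinear link

assert abs(pearson(x, y)) < 1e-12    # the linear measure sees nothing
assert mutual_information(x, y) > 1.0  # MI captures the dependence
```

Because x is symmetric about zero, cov(x, x²) is exactly zero, while I(X; Y) equals H(Y) > 0 for the deterministic link.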
“…A comprehensive introduction to TE is provided by Bossomaier et al. [11], whereas an overview of causality detection based on information-theoretic approaches in time series analysis can be found in [15]. A non-parametric characterization of causality relying on conditional entropy was proposed by Baghli [16].…”
Section: Background: Transfer Information Entropy
confidence: 99%
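With one-step histories, transfer entropy from Y to X is the conditional mutual information I(X_{t+1}; Y_t | X_t). A minimal plug-in illustration on a toy binary series where Y demonstrably drives X (my own sketch under these assumptions, not code from any cited work):

```python
# Sketch: plug-in transfer entropy TE(Y -> X) = I(X_{t+1}; Y_t | X_t)
# with history length 1, on a toy series where Y drives X by one step.
import random
from collections import Counter
from math import log2

def entropy(xs):
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def transfer_entropy(src, dst):
    """TE(src -> dst) with one-step histories, via entropy terms:
    I(F; S | D) = H(F, D) + H(S, D) - H(F, S, D) - H(D)."""
    future = dst[1:]     # X_{t+1}
    d_past = dst[:-1]    # X_t
    s_past = src[:-1]    # Y_t
    return (entropy(list(zip(future, d_past)))
            + entropy(list(zip(s_past, d_past)))
            - entropy(list(zip(future, s_past, d_past)))
            - entropy(d_past))

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + [y[t] for t in range(4999)]    # X copies Y with a one-step lag

assert transfer_entropy(y, x) > 0.5      # strong driving: near 1 bit
assert transfer_entropy(x, y) < 0.05     # reverse direction: near zero
```

The asymmetry between the two directions is what makes TE usable as a directed, model-free dependence measure.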
“…Diks and Panchenko (2005) critically discussed the previous tests of Hiemstra and Jones (1994). As the most recent development in economics, Baghli (2006) proposes information-theoretic statistics for a model-free characterization of causality, based on an evaluation of conditional entropy.…”
Section: Transfer Entropy, Other Information-Theoretical Measures And
confidence: 99%
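The conditional-entropy idea behind such model-free characterizations can be summarized as follows (in my notation, which may differ from Baghli's):

```latex
% Y fails to (Granger-)cause X when conditioning on Y's past
% adds no information about X's next value:
H\!\left(X_{t+1} \mid X_t, \ldots, X_{t-k}\right)
  = H\!\left(X_{t+1} \mid X_t, \ldots, X_{t-k},\; Y_t, \ldots, Y_{t-k}\right)
% whereas a strict drop in the right-hand conditional entropy
% relative to the left-hand one signals predictive influence of Y on X.
```

The gap between the two conditional entropies is exactly the transfer entropy discussed in the preceding citation statement.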