2015
DOI: 10.3390/e17064173
Contribution to Transfer Entropy Estimation via the k-Nearest-Neighbors Approach

Abstract: This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy involving a rectangular neighboring region, as suggested in already existing literature, and develop two kinds of entropy estimators. Then, applying the widely-used error cancellation approach to these entropy estimators, we propose two novel transfer entropy estimators, implying no extra computational cost compared to existing similar k-NN…
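The abstract does not spell out the proposed estimators, so the following is only a reference sketch of the generic k-NN route to transfer entropy: the classical Kozachenko-Leonenko entropy estimator combined into a naive TE with history length 1. The estimator choice, the history length, and the toy coupled process are assumptions for illustration; this is not the paper's rectangular-neighborhood or error-cancellation estimators.

```python
# Minimal sketch (assumptions: Kozachenko-Leonenko k-NN entropy, history length 1).
# Illustrates the generic k-NN approach to transfer entropy, not the paper's method.
import numpy as np
from math import gamma, pi
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(x, k=4):
    """Kozachenko-Leonenko estimate of differential entropy (in nats)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor; the query returns
    # the point itself at distance 0, hence k + 1 neighbors are requested.
    eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    c_d = pi ** (d / 2) / gamma(d / 2 + 1)  # volume of the unit d-ball
    return digamma(n) - digamma(k) + np.log(c_d) + d * np.mean(np.log(eps))

def transfer_entropy(x, y, k=4):
    """Naive TE(X -> Y) = H(Y_t, Y_-) - H(Y_-) - H(Y_t, Y_-, X_-) + H(Y_-, X_-).
    The errors of the four k-NN estimates do not cancel here; constructing
    estimators in which they partially do is what the paper addresses."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    return (kl_entropy(np.column_stack([yt, yp]), k)
            - kl_entropy(yp, k)
            - kl_entropy(np.column_stack([yt, yp, xp]), k)
            + kl_entropy(np.column_stack([yp, xp]), k))

# Toy check on a unidirectionally coupled pair: TE(X -> Y) should exceed TE(Y -> X).
rng = np.random.default_rng(0)
n = 5000
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.3 * y[t - 1] + 0.5 * rng.standard_normal()
print(transfer_entropy(x, y, k=4), transfer_entropy(y, x, k=4))
```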

Citations: cited by 27 publications (19 citation statements).
References: 41 publications.
“…However, with TE-kraskov, a very slow convergence of information exchange with increasing time series length is noted. This behavior was reported in the study by [49]. In summary, for this system, IF-linear is able to detect the asymmetry in the information exchange while TE-linear fails.…”
Section: Results (supporting; confidence: 84%)
“…Moreover, it is sensitive to the size of the bins used. Other nonparametric entropy estimation methods have also been used for computing the TE [21,22,23]: kernel density estimation methods, nearest-neighbor, Parzen, neural networks, etc.…”
Section: Background: Transfer Information Entropy (mentioning; confidence: 99%)
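The bin-size sensitivity mentioned in the statement above is easy to reproduce. The sketch below uses a plain plug-in histogram entropy estimator on standard normal data (both are assumptions chosen purely for illustration, not estimators from the cited works) and shows the estimate drifting as the number of bins changes.

```python
# Illustration of bin-size sensitivity for a plug-in histogram entropy estimate
# (assumed setup: standard normal sample, whose true differential entropy is known).
import numpy as np

def hist_entropy(x, bins):
    # Plug-in estimate of differential entropy from a normalized histogram.
    p, edges = np.histogram(x, bins=bins, density=True)
    w = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]) * w[nz])

rng = np.random.default_rng(1)
x = rng.standard_normal(20_000)
true_h = 0.5 * np.log(2 * np.pi * np.e)  # ~1.4189 nats for N(0, 1)
for bins in (5, 20, 100, 1000):
    print(f"bins={bins:4d}  H_hat={hist_entropy(x, bins):.4f}  H_true={true_h:.4f}")
```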
“…(1) by recognizing it as the expectation of the logarithm of the Radon-Nikodym derivative of a given conditional probability measure with respect to a distinct, but equivalent, conditional probability measure (as observed in [36]). We point out that the Radon-Nikodym derivative serves as the density of a measure with respect to another and can function as a generalized Jacobian, changing between those measures under an integral, analogously to a normal derivative.…”
Section: Transfer Entropy in Continuous Time (mentioning; confidence: 99%)
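For orientation, the block below gives the standard discrete-time counterpart of that formulation, under the assumption of finite history lengths l and m and of conditional densities existing so that the Radon-Nikodym derivative reduces to a density ratio; it is not the continuous-time construction of the cited work.

```latex
\[
  T_{X \to Y}
  \;=\; \mathbb{E}\!\left[
      \log \frac{\mathrm{d}P_{Y_t \mid Y_{t-1}^{(l)},\, X_{t-1}^{(m)}}}
                {\mathrm{d}P_{Y_t \mid Y_{t-1}^{(l)}}}
    \right]
  \;=\; \mathbb{E}\!\left[
      \log \frac{p\!\left(y_t \mid y_{t-1}^{(l)}, x_{t-1}^{(m)}\right)}
                {p\!\left(y_t \mid y_{t-1}^{(l)}\right)}
    \right],
\]
```

The second equality holds when the two conditional laws admit densities and are equivalent (mutually absolutely continuous), so that the derivative exists.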