An Introduction to Transfer Entropy (2016)
DOI: 10.1007/978-3-319-43222-9_5

Information Transfer in Canonical Systems

Cited by 34 publications (93 citation statements)
References 0 publications
“…In the first instance, this simplifies eqs. (21) and (22) (see also [40, 54–57]). Returning, for continuity, to an expression of paths, x, we can represent any path containing N_x spikes starting at time t_0 as x_{t_0}^{t} ≡ {t, {t} …”
Section: Application to Spike Trains
confidence: 94%
“…(1) represents a rate of a transfer of information per discretized time step [38]. Consequently, without such a fundamental temporal discretization we must initially define a transfer entropy rate in Proposition 1 (see also [39–42]). We emphasize that this naturally leads to integrated quantities, in the form of functionals of realized paths, which we introduce subsequently (Proposition 2).…”
Section: B. Continuous Time Formalism
confidence: 99%
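For orientation, the "per discretized time step" quantity the quotation contrasts against is the standard discrete-time transfer entropy. The formulas below use generic notation (target history length k, source history length l) rather than the quoted paper's symbols, and the rate expression is only a schematic sketch of the limit the passage alludes to, not the exact content of its Proposition 1:

```latex
% Discrete-time transfer entropy: an information flow per time step,
% with target history length k and source history length l
T_{X \to Y} \;=\; \sum p\big(y_{t+1},\, y_t^{(k)},\, x_t^{(l)}\big)\,
   \log \frac{p\big(y_{t+1} \mid y_t^{(k)},\, x_t^{(l)}\big)}
             {p\big(y_{t+1} \mid y_t^{(k)}\big)}

% Without a fundamental time step, a rate can be sketched as the limit
% of the per-step quantity over a vanishing discretization \Delta t
\dot{T}_{X \to Y} \;=\; \lim_{\Delta t \to 0}
   \frac{T_{X \to Y}^{(\Delta t)}}{\Delta t}
```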
“…The concept of information processing presented herein is adopted from information dynamics [29–32], a formalism that quantifies computation in dynamical systems using methods from information theory. In information dynamics, Schreiber's transfer entropy [27] is identified with the information processed between two elements of a dynamical system.…”
Section: Intrinsic Complexity
confidence: 99%
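As a concrete illustration of Schreiber's transfer entropy on discrete data, a minimal plug-in estimator might look like the following. The function name, the binary test signals, and the restriction to a single past source sample are assumptions made for this sketch, not details taken from the quoted papers:

```python
from collections import Counter
import math

def transfer_entropy(source, target, k=1):
    """Plug-in estimate of Schreiber's transfer entropy T(source -> target)
    for discrete-valued sequences: the reduction in uncertainty about the
    next target value given one past source sample, beyond what the target's
    own k-step history explains. (Hypothetical minimal sketch.)"""
    # Joint counts of (target_future, target_past_k, source_past)
    joint = Counter()
    for t in range(k, len(target)):
        joint[(target[t], tuple(target[t - k:t]), source[t - 1])] += 1
    n = sum(joint.values())
    # Marginal counts for the two conditional probabilities in the TE sum
    c_px, c_fp, c_p = Counter(), Counter(), Counter()
    for (f, p, x), c in joint.items():
        c_px[(p, x)] += c
        c_fp[(f, p)] += c
        c_p[p] += c
    return sum(
        (c / n) * math.log2((c / c_px[(p, x)]) / (c_fp[(f, p)] / c_p[p]))
        for (f, p, x), c in joint.items()
    )
```

On a toy pair where the target simply copies the source with a one-step lag, the estimate approaches 1 bit in the driving direction and roughly 0 in the reverse, which is the directionality the passage attributes to transfer entropy.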
“…We find very similar values of TE for = 4 and 6 and observe a considerable decrease in TE for < 4 and > 6 (cf. Section 4.2 of [32]), relative to the value of TE for = 5. We therefore consider = 5 as the optimal value of that properly captures the past history of for the results presented here (see Section 3 for a visual explanation of why ∼ 5 represents the optimal history length).…”
Section: Intrinsic Complexity
confidence: 99%
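The selection procedure described in the quotation, sweeping candidate history lengths and keeping the value around which the TE estimate stops changing, can be sketched as follows. The XOR-driven toy system, the compact estimator, and the sweep range 1–5 are assumptions for illustration, not the quoted paper's setup:

```python
from collections import Counter
import math
import random

def te_plugin(src, tgt, k):
    """Compact plug-in transfer entropy with target history length k
    and a single past source sample (hypothetical helper for this sketch)."""
    joint = Counter()
    for t in range(k, len(tgt)):
        joint[(tgt[t], tuple(tgt[t - k:t]), src[t - 1])] += 1
    n = sum(joint.values())
    c_px, c_fp, c_p = Counter(), Counter(), Counter()
    for (f, p, x), c in joint.items():
        c_px[(p, x)] += c
        c_fp[(f, p)] += c
        c_p[p] += c
    return sum((c / n) * math.log2((c / c_px[(p, x)]) / (c_fp[(f, p)] / c_p[p]))
               for (f, p, x), c in joint.items())

# Toy target whose next value depends on its own last TWO steps plus the
# source: a history shorter than 2 misses structure, longer adds nothing.
random.seed(1)
x = [random.randint(0, 1) for _ in range(20000)]
y = [0, 0]
for t in range(2, len(x)):
    y.append(y[t - 1] ^ y[t - 2] ^ x[t - 1])

# Sweep candidate history lengths; here the estimate rises once the history
# covers the dependence (k = 2) and then plateaus, so k = 2 would be kept.
te_by_k = {k: te_plugin(x, y, k) for k in range(1, 6)}
```

In this synthetic case the sweep exhibits a plateau rather than the decrease reported in the quoted analysis; the point of the sketch is only the procedure of comparing TE across history lengths, not reproducing that paper's curve.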