2007
DOI: 10.1016/j.jneumeth.2006.12.008
Causal entropies—A measure for determining changes in the temporal organization of neural systems

Abstract: We propose a novel measure to detect temporal ordering in the activity of individual neurons in a local network, which is thought to be a hallmark of activity-dependent synaptic modifications during learning. The measure, called Causal Entropy, is based on the time-adaptive detection of asymmetries in the relative temporal patterning between neuronal pairs. We characterize properties of the measure on both simulated data and experimental multiunit recordings of hippocampal neurons from the awake, behaving rat,…

Cited by 18 publications (11 citation statements) · References 37 publications
“…The idea that information in spike times relates to an underlying directed relationship has been observed, e.g., in [58] and regarding "causal entropy" in [59,60], which indeed computed entropies of (cross) inter-spike intervals. However to our knowledge, this is the first formulation that computes transfer entropy based on lossless representation of entire spike trains (and is thus a dynamic quantity which captures state-updates rather than static correlations of single spike-time relationships).…”
Section: Application To Spike Trains
confidence: 98%
“…More recent developments aim at estimating Granger causality directly from Fourier and wavelet transforms (Dhamala et al, 2008), rely on partial (Frenzel and Pompe, 2007) or conditional mutual information Vejmelka and Paluš, 2008), or make use of entropy estimates (Waddell et al, 2007;Liang, 2008). In order to improve the estimation of transfer entropy (Schreiber, 2000), Lungarella et al (2007) proposed a wavelet-based extension of this measure, and introduced symbolic transfer entropy as a robust and computationally fast method to quantify the dominating direction of information flow between time series.…”
Section: Bivariate Time Series Analysis Techniques
confidence: 99%
“…Finally, another measure proposed to estimate coupling between time series is the causal entropy (CauEn) [25]. This index is an asymmetric, time-adaptive, event-based measure of the regularity of the phase- or time-lag with which point i fires after point j.…”
Section: Irregularity Quantification
confidence: 99%
“…It is calculated from two components: a non-parametric time-adaptive estimate of the probability density of spike time lag between two points i and j such that i follows j (and, independently, the distribution of j following i), and a cost function estimate of the spread and stability of the distribution. Although a variety of alternatives exists to compute this metric, CauEn can be easily estimated by choosing an event-normalized histogram as the time-adaptive density estimator and the ShEn as the cost function [25].…”
Section: Irregularity Quantification
confidence: 99%
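The excerpt above outlines one easy way to estimate CauEn: collect the lags with which unit i fires after unit j, form an event-normalized histogram of those lags, and take the Shannon entropy of that distribution as the cost function. A minimal sketch under those assumptions follows; the function name, bin count, and maximum-lag cutoff are illustrative choices, not parameters taken from [25]:

```python
import numpy as np

def causal_entropy(spikes_i, spikes_j, n_bins=20, max_lag=0.5):
    """Sketch of a causal-entropy-style estimate: the Shannon entropy of the
    histogram of lags with which unit i fires after unit j.

    spikes_i, spikes_j : sorted spike times (seconds) of the two units.
    n_bins, max_lag    : illustrative histogram settings, not from the paper.
    """
    spikes_i = np.asarray(spikes_i, dtype=float)
    lags = []
    for t_j in np.asarray(spikes_j, dtype=float):
        # first spike of i strictly after this spike of j
        later = spikes_i[spikes_i > t_j]
        if later.size and (later[0] - t_j) <= max_lag:
            lags.append(later[0] - t_j)
    if not lags:
        return float("nan")
    counts, _ = np.histogram(lags, bins=n_bins, range=(0.0, max_lag))
    p = counts / counts.sum()            # event-normalized histogram of lags
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())  # Shannon entropy of the lag distribution
```

Computing the measure in both directions, H(i after j) versus H(j after i), yields the asymmetry that indicates the dominant temporal ordering between the pair; a narrow, stable lag distribution gives low entropy, a diffuse one gives high entropy.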