2012
DOI: 10.1142/s201019451200788x

EEG Transfer Entropy Tracks Changes in Information Transfer on the Onset of Vision

Abstract: We investigate the pairwise mutual information and transfer entropy of ten-channel, free-running electroencephalographs measured from thirteen subjects under two behavioral conditions: eyes open resting and eyes closed resting. Mutual information measures nonlinear correlations; transfer entropy determines the directionality of information transfer. For all …

Int. J. Mod. Phys. Conf. Ser. 2012, 17: 9-18. M. D. Madulara et al.
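
The pairwise analysis the abstract describes — mutual information evaluated over every pair of channels — can be illustrated with a short, hedged sketch. The histogram estimator, bin count, channel count, and synthetic data below are assumptions for illustration, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of pairwise mutual information
# between channels of a multichannel recording, using a 2-D histogram
# estimator. Bin count and synthetic data are illustrative assumptions.
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X;Y) in bits from a 2-D histogram of two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic stand-in for a ten-channel, free-running EEG segment.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((10, 5000))
eeg[1] += 0.5 * eeg[0]                      # inject one correlated pair for illustration

# Pairwise MI matrix over all channel pairs.
n_ch = eeg.shape[0]
mi = np.zeros((n_ch, n_ch))
for i in range(n_ch):
    for j in range(i + 1, n_ch):
        mi[i, j] = mi[j, i] = mutual_information(eeg[i], eeg[j])
print(mi.round(3))
```

With a fixed partition this estimator is biased for short records; the citation statements below point to an adaptive partition of the joint distribution as one way to make the estimate more robust.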

Cited by 6 publications (3 citation statements)
References 15 publications
“…As would be anticipated alterations in networks are associated with traumatic brain injury (Cao and Slobounov 2010 ; Nakamura et al 2009 ; Tsirka et al 2011 ; Zouridakis et al 2011 ; Catsellanos et al 2011a , b ). The calculations presented in this paper and in Madulara et al ( 2012 ) suggest that when calculated using an adaptive partition of the joint probability distribution, mutual information, lagged mutual information and transfer entropy can provide computationally efficient, noise-robust metrics for the analysis of CNS small world networks.…”
Section: Discussion (mentioning, confidence: 73%)
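
The statement above attributes the metrics' efficiency and noise robustness to an adaptive partition of the joint probability distribution. One common realisation of such a partition — used here purely as an assumption, not necessarily the scheme in Madulara et al (2012) — is equiprobable (quantile) binning of each signal before the joint histogram is built:

```python
# Hedged sketch: mutual information with an adaptive (quantile) partition,
# so every marginal bin has roughly equal occupancy.
import numpy as np

def quantile_symbols(x, n_bins=8):
    """Map a signal to integer bin labels with (nearly) equal occupancy."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

def mutual_information_adaptive(x, y, n_bins=8):
    """MI in bits from adaptively partitioned signals."""
    sx, sy = quantile_symbols(x, n_bins), quantile_symbols(y, n_bins)
    joint = np.zeros((n_bins, n_bins))
    np.add.at(joint, (sx, sy), 1)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Toy usage: two linearly coupled signals.
rng = np.random.default_rng(1)
x = rng.standard_normal(4000)
y = 0.6 * x + 0.8 * rng.standard_normal(4000)
print(mutual_information_adaptive(x, y))
```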
“…This may have been a factor in the Schreiber study. Madulara et al ( 2012 ) calculated transfer entropy using the EEG records analyzed in this paper. Mutual information was generally lower in the eyes open than in the eyes closed condition.…”
Section: Discussion (mentioning, confidence: 99%)
“…Despite its success in detecting the direction of interactions in the brain, it either makes assumptions about the structure of the interacting systems or the nature of their interactions and as such, it may suffer from the shortcomings of modeling systems/signals of unknown structure (Lainscsek et al, 2013;Sohrabpour et al, 2016;Bonmati, 2018). Even though much has been achieved with the GCA, a different data-driven approach which involves information theoretic measures like Transfer entropy (TE) may play a critical role in elucidating the effective connectivity of non-linear complex systems that the GCA may fail to unearth (Schreiber, 2006;Madulara et al, 2012;Dejman et al, 2017). Mathematically, the TE uses its entropy to quantitatively infer the coupling strength between two variables (Liu and Aviyente, 2012;Shovon et al, 2017) and has the potential for capturing both the linear and non-linear causal interactions effectively.…”
Section: Introduction (mentioning, confidence: 99%)
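
The transfer entropy invoked in the statement above can be written as TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ], which is asymmetric in X and Y and therefore directional. The sketch below estimates it with order-1 histories and a simple fixed binning; both choices, and the synthetic coupled signals, are assumptions for illustration rather than the cited papers' implementation.

```python
# Minimal transfer-entropy sketch: TE(source -> target) in bits with
# one-sample histories and fixed-width binning (both assumptions).
import numpy as np

def transfer_entropy(source, target, n_bins=8):
    """Estimate TE(source -> target) in bits with one-sample histories."""
    def symbols(v):
        # Fixed-width bins spanning the signal's range.
        edges = np.linspace(v.min(), v.max(), n_bins + 1)[1:-1]
        return np.digitize(v, edges)

    x, y = symbols(target), symbols(source)
    xf, xp, yp = x[1:], x[:-1], y[:-1]      # future of X, past of X, past of Y

    # Joint distribution p(x_{t+1}, x_t, y_t) from counts.
    joint = np.zeros((n_bins, n_bins, n_bins))
    np.add.at(joint, (xf, xp, yp), 1)
    pxxy = joint / joint.sum()

    p_xp_yp = pxxy.sum(axis=0)              # p(x_t, y_t)
    p_xf_xp = pxxy.sum(axis=2)              # p(x_{t+1}, x_t)
    p_xp = pxxy.sum(axis=(0, 2))            # p(x_t)

    te = 0.0
    for (i, j, k), p in np.ndenumerate(pxxy):
        if p > 0:
            te += p * np.log2(p * p_xp[j] / (p_xf_xp[i, j] * p_xp_yp[j, k]))
    return te

# Directed-coupling demo: y drives x with a one-sample lag.
rng = np.random.default_rng(2)
y = rng.standard_normal(5000)
x = np.roll(y, 1) + 0.5 * rng.standard_normal(5000)
print(transfer_entropy(y, x))   # should exceed the reverse direction below
print(transfer_entropy(x, y))
```

Swapping source and target gives the reverse-direction estimate; comparing the two is how the directionality of information transfer is read off in practice.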