2018
DOI: 10.1109/tsp.2018.2821627

Mutual Information in Frequency and Its Application to Measure Cross-Frequency Coupling in Epilepsy

Abstract: We define a metric, mutual information in frequency (MI-in-frequency), to detect and quantify the statistical dependence between different frequency components in the data, referred to as cross-frequency coupling, and apply it to electrophysiological recordings from the brain to infer cross-frequency coupling. The current metrics used to quantify cross-frequency coupling in neuroscience cannot detect whether two frequency components in non-Gaussian brain recordings are statistically independent. Our MI-in-…

Cited by 19 publications (51 citation statements)
References 48 publications
“…This highlights the power of MIF to quantify nonlinear relationships across frequencies, which is fundamentally different from the restriction of coherence to same-frequency interactions, as made clear by the single-frequency indexing in (1). Finally, we mention that coherence (1) has a direct relationship to MIF for linear GPs 5 :…”
Section: In Frequency (MIF)
confidence: 90%
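The coherence relationship that the quote truncates is presumably the standard identity for jointly Gaussian (linear GP) processes, stated below as an assumption rather than from the source; C_XY(f) denotes the magnitude-squared coherence built from the auto- and cross-spectra:

```latex
% For jointly Gaussian stationary processes, the mutual information
% between the Fourier coefficients of X and Y at a common frequency f
% is determined entirely by the magnitude-squared coherence:
\mathrm{MI}_{XY}(f) = -\log\bigl(1 - C_{XY}(f)\bigr),
\qquad
C_{XY}(f) = \frac{\lvert S_{XY}(f)\rvert^{2}}{S_{XX}(f)\,S_{YY}(f)}.
```

Under this identity, MIF reduces to a monotone function of coherence in the linear Gaussian case, which is consistent with the quote's point that MIF strictly generalizes coherence to nonlinear, cross-frequency dependence.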
“…In order to estimate MI in the frequency domain between two random processes X(t) and Y(t), we augmented the procedure described in 5 with the multitaper approach 9 . One begins by taking non-overlapping windows of sample paths of both processes, and independence between windows is assumed.…”
Section: In Frequency (MIF)
confidence: 99%
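The windowing procedure described in the quote can be sketched as follows. This is a minimal illustration, not the authors' estimator: it uses a plain FFT rather than the multitaper approach, summarizes each complex Fourier coefficient by its log-magnitude, and plugs a 2-D histogram in for the density estimate; the function name and parameters are hypothetical.

```python
import numpy as np

def mi_in_frequency(x, y, f1_bin, f2_bin, win_len=256, n_bins=8):
    """Rough MI-in-frequency estimate between frequency bin f1_bin of x
    and bin f2_bin of y.  Non-overlapping windows are treated as
    independent samples, as in the quoted procedure; the histogram MI
    estimator stands in for whatever density estimator the paper
    actually uses (an assumption of this sketch)."""
    n_win = min(len(x), len(y)) // win_len
    # One complex Fourier coefficient per non-overlapping window.
    Xf = np.array([np.fft.rfft(x[i * win_len:(i + 1) * win_len])[f1_bin]
                   for i in range(n_win)])
    Yf = np.array([np.fft.rfft(y[i * win_len:(i + 1) * win_len])[f2_bin]
                   for i in range(n_win)])
    # Log-magnitude as a scalar summary (a simplification: the full
    # method works with the joint law of real and imaginary parts).
    a = np.log(np.abs(Xf) + 1e-12)
    b = np.log(np.abs(Yf) + 1e-12)
    # Plug-in MI from a joint histogram and its marginals.
    pxy, _, _ = np.histogram2d(a, b, bins=n_bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```

Because both marginals come from the same joint histogram, the plug-in estimate is a KL divergence and therefore non-negative; independent inputs give a small positive value (finite-sample bias) rather than exactly zero.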
“…Mutual information (MI) is a well-known information-theoretic criterion that measures the shared information content, or mutual dependence, between two random variables. It was first conceptualized by Shannon [28] and has since been widely adopted in a variety of applications, especially in biology [29, 30]. The mutual information I between two discrete random variables X and Y can be described as in [29]: …”
Section: Methods
confidence: 99%
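The definition from [29] is truncated in the quote; it is presumably the standard Shannon form, I(X;Y) = Σ_{x,y} p(x,y) log[ p(x,y) / (p(x) p(y)) ]. A minimal sketch of that formula, with an illustrative function name:

```python
import numpy as np

def mutual_information(pxy):
    """Mutual information I(X;Y) in nats from a joint probability
    table pxy[i, j] = P(X = i, Y = j).  This implements the standard
    Shannon definition, which we assume is the form the truncated
    quote refers to."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(X), shape (m, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(Y), shape (1, n)
    nz = pxy > 0                          # convention: 0 * log 0 = 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

For a product joint (independent variables, e.g. all entries 0.25 in a 2×2 table) this returns 0, and for the fully dependent joint [[0.5, 0], [0, 0.5]] it returns log 2 ≈ 0.693 nats.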