2006
DOI: 10.1109/sp-m.2006.248712
Signal Processing Using Mutual Information

Cited by 20 publications (29 citation statements)
References 13 publications
“…As shown in figure 1 the DV partitioning algorithm allows us to partition the state-space associated with a multivariate time series into varying size bins (or hypercubes) for the purpose of density estimation (Hudson 2006). The DV partitioning was previously shown to be effective in calculating transfer entropy (Lee et al 2012; Nemati et al 2013), a statistical measure of the amount of directed entropy transfer between two random processes, and it was shown to have lower computational cost than the competing methods.…”
Section: Methods
confidence: 99%
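The transfer entropy mentioned in the quoted statement measures directed information flow from one time series to another. A minimal sketch is given below; it uses a plain fixed-width histogram estimator rather than the adaptive Darbellay–Vajda (DV) partitioning the citing papers describe, so it is for illustration only, not a reproduction of their method.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Estimate transfer entropy TE(X -> Y) in bits.

    Plug-in estimate of
        TE = sum p(y_next, y_past, x_past)
             * log2[ p(y_next | y_past, x_past) / p(y_next | y_past) ]
    using fixed-width bins (NOT the adaptive DV partitioning
    referenced in the quoted text; a hedged sketch only).
    """
    yn, yp, xp = y[1:], y[:-1], x[:-1]           # y_next, y_past, x_past
    p_xyz, _ = np.histogramdd((yn, yp, xp), bins=bins)
    p_xyz /= p_xyz.sum()                          # joint p(y_next, y_past, x_past)
    p_yz = p_xyz.sum(axis=2)                      # p(y_next, y_past)
    p_zx = p_xyz.sum(axis=0)                      # p(y_past, x_past)
    p_z = p_xyz.sum(axis=(0, 2))                  # p(y_past)
    te = 0.0
    for i, j, k in zip(*np.nonzero(p_xyz)):
        num = p_xyz[i, j, k] * p_z[j]
        den = p_yz[i, j] * p_zx[j, k]
        te += p_xyz[i, j, k] * np.log2(num / den)
    return te
```

Because transfer entropy is asymmetric, a series that drives another should show a larger value in the driving direction than in the reverse one.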
“…To compare the MI across components, the normalized MI was calculated utilizing the entropy of s i (Hudson 2006;Liu et al, 2012):…”
Section: Mutual Information
confidence: 99%
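The normalization described in the quoted statement, dividing the mutual information by the entropy of the component, can be sketched as follows. This is a hedged illustration using a simple 2-D histogram estimator; the cited papers may differ in estimator and binning details.

```python
import numpy as np

def normalized_mi(x, s, bins=16):
    """MI(x, s) divided by the entropy H(s), both in bits.

    Sketch of the normalization described in the quoted text
    (MI scaled by the entropy of the component s); histogram
    binning here is an illustrative choice, not the cited method.
    """
    p_xs, _, _ = np.histogram2d(x, s, bins=bins)
    p_xs /= p_xs.sum()                 # joint probability p(x, s)
    p_x = p_xs.sum(axis=1)             # marginal p(x)
    p_s = p_xs.sum(axis=0)             # marginal p(s)
    mi = 0.0
    for i, j in zip(*np.nonzero(p_xs)):
        mi += p_xs[i, j] * np.log2(p_xs[i, j] / (p_x[i] * p_s[j]))
    h_s = -np.sum(p_s[p_s > 0] * np.log2(p_s[p_s > 0]))
    return mi / h_s
```

With this normalization the value lies in [0, 1] for the empirical distribution: a component compared with itself gives exactly 1, since MI(s, s) = H(s).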
“…It is based on the combination of independent component analysis (ICA; Bell and Sejnowski, 1995) and mutual information theory (MI; Hudson, 2006). Thereby, ICA is initially used to decompose data from the sensor level to the component level.…”
Section: Introduction
confidence: 99%
“…To compare the mutual information across components, the normalized mutual information (MI) was calculated utilizing the entropy of s t (Hudson, 2006;Liu et al, 2012a):…”
Section: Mutual Information
confidence: 99%