2007
DOI: 10.1103/physreve.76.026209
Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data

Abstract: Commonly used dependence measures, such as linear correlation, cross-correlogram, or Kendall's τ, cannot capture the complete dependence structure in data unless the structure is restricted to linear, periodic, or monotonic. Mutual information (MI) has been frequently utilized for capturing the complete dependence structure, including nonlinear dependence. Recently, several methods have been proposed for MI estimation, such as kernel density estimators (KDEs), k-nearest neighbors (KNNs), Edgeworth approximation…
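As a hedged illustration of the estimator families named in the abstract (not the paper's benchmark protocol), the sketch below compares a KDE-based and a KNN-based MI estimate against linear correlation on a short, noisy, nonlinearly dependent sample. The sample size, noise level, and library choices (SciPy's gaussian_kde, scikit-learn's mutual_info_regression, which implements a KNN-based estimator) are assumptions for illustration only.

```python
# Minimal sketch: KDE-based vs. KNN-based MI estimates on a short, noisy,
# nonlinearly dependent sample. N and the noise level are illustrative.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
N = 100
x = rng.normal(size=N)
y = x**2 + 0.5 * rng.normal(size=N)          # nonlinear dependence + noise

# KDE estimate: I(X;Y) ~ (1/N) * sum log[ p(x,y) / (p(x) p(y)) ]
p_xy = gaussian_kde(np.vstack([x, y]))       # joint density estimate
p_x, p_y = gaussian_kde(x), gaussian_kde(y)  # marginal density estimates
mi_kde = np.mean(np.log(p_xy(np.vstack([x, y])) / (p_x(x) * p_y(y))))

# KNN estimate: Kraskov-style estimator as implemented in scikit-learn
mi_knn = mutual_info_regression(x.reshape(-1, 1), y, n_neighbors=3)[0]

print(f"KDE-based MI estimate: {mi_kde:.3f} nats")
print(f"KNN-based MI estimate: {mi_knn:.3f} nats")
print(f"Pearson correlation:   {np.corrcoef(x, y)[0, 1]:.3f}")  # near zero here
```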

Cited by 181 publications
(169 citation statements)
References 27 publications
“…This paper is the natural follow-up of PP12 [12], studying now the statistics (mean or bias, variance and distribution) of the MinMI estimation errors, i.e., the errors of the ME estimate issued from N-sized samples of iid outcomes. Those errors are roughly similar to the errors of generic MI and entropy estimators (see [13,14] for a thorough review and performance comparison of MI estimators). Their mean (bias), variance and higher-order moments are written in terms of powers of 1/N, thus covering intermediate and asymptotic N ranges [15], with specific applications in neurophysiology [16,17,18].…”
Section: The State of the Art (mentioning)
confidence: 49%
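The 1/N expansions referred to in this excerpt are specific to the MinMI setting of the cited work. As a hedged, generic illustration of the kind of expansion meant, the plug-in (maximum-likelihood) entropy estimator over m occupied bins has the well-known Miller-Madow leading bias and an O(1/N) variance:

```latex
% Generic illustration (not the MinMI-specific expressions of [12]):
% leading-order bias and variance of the plug-in entropy estimator
% \hat{H}_{\mathrm{MLE}} over m occupied bins from N iid samples.
\mathbb{E}\big[\hat{H}_{\mathrm{MLE}}\big] - H
  = -\frac{m-1}{2N} + \mathcal{O}\!\left(N^{-2}\right),
\qquad
\operatorname{Var}\big[\hat{H}_{\mathrm{MLE}}\big]
  = \frac{\operatorname{Var}\big[-\log p(X)\big]}{N} + \mathcal{O}\!\left(N^{-2}\right).
```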
“…Both entropy and conditional entropy are related to mutual information through the following expression: I(R_1; R_2) = H(R_1) − H(R_1|R_2). We have favoured mutual information over other measures of statistical dependence, such as correlation coefficients, for its equitability, i.e., its ability to detect general, not only linear or monotonic, dependence (Khan et al., 2007; Kinney and Atwal, 2014). We have chosen, however, to present results in this paper in terms of conditional entropy, which is trivial to obtain from the corresponding mutual information.…”
Section: Evaluating the Feasibility of Communication Inference in Onl… (mentioning)
confidence: 99%
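Since the excerpt relies on the identity I(R_1; R_2) = H(R_1) − H(R_1|R_2), here is a minimal sketch verifying it numerically on a hypothetical 2×2 joint distribution; the distribution itself is an illustrative assumption, not data from the cited work.

```python
# Verify I(R1;R2) = H(R1) - H(R1|R2) on a hypothetical 2x2 joint distribution.
import numpy as np

p_joint = np.array([[0.30, 0.10],      # rows: values of R1
                    [0.05, 0.55]])     # cols: values of R2

p1 = p_joint.sum(axis=1)               # marginal of R1
p2 = p_joint.sum(axis=0)               # marginal of R2

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))     # Shannon entropy in bits

H1 = entropy(p1)
H12 = entropy(p_joint.ravel())         # joint entropy H(R1, R2)
H1_given_2 = H12 - entropy(p2)         # chain rule: H(R1|R2) = H(R1,R2) - H(R2)
mi = sum(p_joint[i, j] * np.log2(p_joint[i, j] / (p1[i] * p2[j]))
         for i in range(2) for j in range(2))

print(f"I(R1;R2)          = {mi:.4f} bits")
print(f"H(R1) - H(R1|R2)  = {H1 - H1_given_2:.4f} bits")   # matches
```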
“…We find NCIE (Wang et al. 2005) to be a very robust measure for both linearly and nonlinearly correlated datasets, which has been applied to the analysis of neurophysiological signals (Pereda et al. 2005), the quantification of the dependence among noisy data (Khan et al. 2007), etc. Therefore, we adopt NCIE as a correlation measure in objective reduction and study its impact on online objective reduction approaches.…”
Section: Introduction (mentioning)
confidence: 99%
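NCIE's exact construction is given in Wang et al. (2005). As a hedged stand-in that captures the same idea of an entropy-based, correlation-like coefficient bounded in [0, 1], the sketch below uses a simple histogram-based normalized mutual information; the bin count, sample size, and test signals are illustrative assumptions, not the NCIE definition itself.

```python
# Hedged sketch: NOT the NCIE of Wang et al. (2005), but a histogram-based
# normalized mutual information in [0, 1], used in the same spirit as a
# correlation-like measure that also detects nonlinear dependence.
import numpy as np

def normalized_mi(x, y, bins=8):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    nz = p_xy > 0
    mi = np.sum(p_xy[nz] * np.log(p_xy[nz] / np.outer(p_x, p_y)[nz]))
    h_x = -np.sum(p_x[p_x > 0] * np.log(p_x[p_x > 0]))
    h_y = -np.sum(p_y[p_y > 0] * np.log(p_y[p_y > 0]))
    return mi / np.sqrt(h_x * h_y)     # ~0 for independence, up to 1

rng = np.random.default_rng(1)
t = rng.uniform(-1, 1, 1000)
print(normalized_mi(t, 2 * t + 0.1 * rng.normal(size=1000)))              # linear: high
print(normalized_mi(t, np.cos(3 * t) + 0.1 * rng.normal(size=1000)))      # nonlinear: high
print(normalized_mi(t, rng.uniform(-1, 1, 1000)))                         # independent: near 0
```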