2008
DOI: 10.1098/rspa.2007.0196

A computationally efficient estimator for mutual information

Abstract: Mutual information quantifies the determinism that exists in a relationship between random variables, and thus plays an important role in exploratory data analysis. We investigate a class of non-parametric estimators for mutual information, based on the nearest neighbour structure of observations in both the joint and marginal spaces. Unless both marginal spaces are one-dimensional, we demonstrate that a well-known estimator of this type can be computationally expensive under certain conditions, and propose a …
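For reference, the quantity these nearest-neighbour estimators target is the mutual information of continuous random variables X and Y, which can be written either as an integral against the densities or as a combination of differential entropies (this standard definition is added here for orientation; it is not quoted from the paper):

```latex
% Mutual information of continuous random variables X and Y, and its
% decomposition into differential entropies:
I(X;Y) = \iint p_{XY}(x,y)\,\log\frac{p_{XY}(x,y)}{p_X(x)\,p_Y(y)}\,dx\,dy
       = H(X) + H(Y) - H(X,Y).
```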

Cited by 35 publications (47 citation statements, published from 2013 to 2024); references 10 publications.
Citation statements:
“…The computation of mutual information involves estimating probability densities, which is computationally expensive. However, the nonparametric estimator for mutual information presented in [23] limits the computational complexity to O(N log N), where N is the number of observations.…”
Section: Discussion (mentioning)
Confidence: 99%
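The O(N log N) bound quoted above is what a space-partitioning structure such as a k-d tree delivers: building the tree costs O(N log N), and each of the N k-nearest-neighbour queries then costs O(log N) on average. A minimal sketch with scipy (the data and parameter values are illustrative, not taken from either paper):

```python
# k-d tree neighbour search: O(N log N) construction plus N queries at
# O(log N) each, matching the overall O(N log N) complexity cited above.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
N, k = 10_000, 4
points = rng.normal(size=(N, 2))       # N observations in a 2-d joint space

tree = cKDTree(points)                 # O(N log N) construction
# Query k+1 neighbours, since each point is its own nearest neighbour.
distances, _ = tree.query(points, k=k + 1)
knn_radius = distances[:, -1]          # distance to each point's k-th neighbour
```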
“…In the literature, there are many algorithms for estimating the probability distribution. However, we use the nonparametric estimators presented in [23]. The approach proposed here is not constrained by any underlying assumption on the density (differentiable, parametric, etc.…
Section: Density Estimation (mentioning)
Confidence: 99%
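The excerpt does not quote which estimator from [23] is used, so purely for illustration, here is the classical k-nearest-neighbour density sketch (the Loftsgaarden-Quesenberry form), which likewise makes no parametric assumptions about the density:

```python
# k-NN density sketch: p_hat(x) = k / (N * V_d * r_k(x)^d), where r_k(x)
# is the distance from x to its k-th nearest sample point and V_d is the
# volume of the d-dimensional unit ball.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def knn_density(sample, queries, k=4):
    n, d = sample.shape
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)        # unit-ball volume
    r_k = cKDTree(sample).query(queries, k=k)[0][:, -1]
    return k / (n * v_d * r_k ** d)
```

Larger k smooths the estimate while smaller k tracks local structure, the same bias-variance trade-off the later excerpts note for the MI estimators themselves.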
“…The KNN approach [29], [30] estimates mutual information based on the metric properties of nearest neighbor balls in both the joint and marginal spaces. This approach exploits the method proposed by Kozachenko and Leonenko [31] to estimate the joint and marginal entropies of random variables for computing MI as defined in (3).…”
Section: A Non-Parametric K-Nearest Neighbor Approach (mentioning)
Confidence: 99%
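The Kozachenko-Leonenko method sidesteps explicit density estimation: nearest-neighbour distances enter the entropy estimate directly. A sketch of its radius form, assuming Euclidean distances and distinct observations (so every logarithm is finite):

```python
# Kozachenko-Leonenko differential-entropy estimator (radius form):
#   H_hat = psi(N) - psi(k) + log(V_d) + (d/N) * sum_i log(r_i),
# where r_i is the distance from x_i to its k-th nearest neighbour.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gamma

def kl_entropy(x, k=4):
    n, d = x.shape
    r = cKDTree(x).query(x, k=k + 1)[0][:, -1]   # k-th neighbour, self excluded
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)    # unit-ball volume
    return digamma(n) - digamma(k) + np.log(v_d) + d * np.mean(np.log(r))
```

Estimating H(X), H(Y) and H(X,Y) this way and combining them as I = H(X) + H(Y) - H(X,Y) is one route to MI; the Kraskov et al. estimator in the next excerpt couples the three terms more carefully.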
“…Hence, a trade-off between the sampling and systematic errors should be taken into account when choosing the parameter k (which determines the size of the kth nearest neighbor ball), to ensure that the total error is small. The reader is referred to [29], [30] for further information regarding the procedure. In this work, we use the KNN estimator of Kraskov et al. [29]; the original implementation of this algorithm can be found in [32].…”
Section: A Non-Parametric K-Nearest Neighbor Approach (mentioning)
Confidence: 99%
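For concreteness, a sketch of the first algorithm of Kraskov et al. [29]; the formula is theirs, but this compact scipy version is illustrative and is not the reference implementation cited as [32]:

```python
# KSG estimator, algorithm 1:
#   I_hat = psi(k) + psi(N) - mean_i[ psi(n_x(i)+1) + psi(n_y(i)+1) ],
# where eps_i is the max-norm distance to the k-th neighbour in the joint
# space and n_x(i), n_y(i) count marginal points strictly inside eps_i.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=4):
    n = len(x)
    x, y = x.reshape(n, -1), y.reshape(n, -1)
    joint = np.hstack([x, y])
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # Shrink the radius slightly to get a strict inequality; the -1 drops
    # the query point itself. Assumes distinct observations.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

The choice of k discussed in the excerpt is visible here: small k keeps the neighbour balls local (small systematic error) but makes the digamma terms noisy (large sampling error), and vice versa.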
“…A very good estimator based on k-nearest-neighbor statistics is proposed in [8]. A computationally efficient modification of this method appeared recently in [9]. The estimation of mutual information sits in the broader context of the estimation of information-type measures such as entropy, Kullback-Leibler distance, divergence functionals and Rényi entropy.…”
Section: Introduction (mentioning)
Confidence: 99%