2019
DOI: 10.1016/j.spl.2018.12.011

On mutual information estimation for mixed-pair random variables

Abstract: We study mutual information estimation for mixed-pair random variables, where one random variable is discrete and the other is continuous. We develop a kernel method to estimate the mutual information between the two random variables. The estimates enjoy a central limit theorem under some regularity conditions on the distributions. The theoretical results are demonstrated by a simulation study.
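The kernel approach described in the abstract lends itself to a plug-in estimator: estimate the joint mass-density, the discrete marginal, and the continuous marginal from the sample, then average the log-ratio over the data points. The sketch below is a minimal illustration in that spirit; the resubstitution form, the Gaussian kernel, the Silverman-style bandwidth, and the name `mi_mixed_kernel` are assumptions for illustration, not the authors' exact construction.

```python
import numpy as np

def mi_mixed_kernel(x, y, h=None):
    """Plug-in MI estimate for discrete x and continuous y (sketch)."""
    x = np.asarray(x)
    y = np.asarray(y, dtype=float)
    n = len(y)
    if h is None:
        h = 1.06 * y.std(ddof=1) * n ** (-0.2)   # Silverman-style bandwidth (assumption)
    norm = h * np.sqrt(2.0 * np.pi)
    # Gaussian kernel weights K((y_i - y_j) / h) for all sample pairs
    w = np.exp(-0.5 * ((y[:, None] - y[None, :]) / h) ** 2)
    f_y = w.mean(axis=1) / norm                  # KDE of the continuous marginal at y_i
    same = x[:, None] == x[None, :]
    f_xy = (w * same).mean(axis=1) / norm        # KDE of the joint mass-density at (x_i, y_i)
    p_x = same.mean(axis=1)                      # empirical pmf of the discrete marginal at x_i
    return float(np.mean(np.log(f_xy / (p_x * f_y))))
```

On simulated data, the estimate should sit near zero when x and y are independent and grow with the strength of their dependence.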

Cited by 9 publications (6 citation statements)
References 11 publications
“…Mutual information is a good measure of a nonlinear relationship. In probability theory, the mutual information measures the mutual dependence between two variables, indicating how much information one random variable communicates about another. For two random variables x and y with n given samples {x_1, x_2, ..., x_n} and {y_1, y_2, ..., y_n}, the mutual information between x and y is defined as

I(x, y) = \sum_{i=1}^{n} \sum_{j=1}^{n} p(x_i, y_j) \log \frac{p(x_i, y_j)}{p(x_i)\, p(y_j)},

where p(x_i) and p(y_j) are the marginal probability density functions of x and y, respectively, while p(x_i, y_j) is the joint probability density function of x and y.…”
Section: The Proposed TSLKPCA Method
mentioning, confidence: 99%
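As a quick illustration of the formula above, a plug-in version for two discrete-valued samples replaces the probabilities with joint and marginal relative frequencies. The sketch below is a toy implementation of that plug-in idea; the name `mutual_information_plugin` is purely illustrative.

```python
import numpy as np

def mutual_information_plugin(x, y):
    """Plug-in MI estimate from two equal-length discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))   # joint relative frequency
            if pxy > 0:                            # skip empty cells (0 log 0 = 0)
                px, py = np.mean(x == xv), np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi
```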
“…which many estimation methods, including our own, rely on. However, Beknazaryan et al. (2019) give a simple sufficient condition for the decomposition in Eq. (2) to hold, which they term the good mixed-pair assumption.…”
Section: Estimation Challenges
mentioning, confidence: 99%
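Eq. (2) of the citing paper is not reproduced in the excerpt, but the decomposition such mixed-pair estimators rely on is the standard one. The display below is a hedged reconstruction of that standard form, not the paper's exact equation: X is discrete with masses p_i = P(X = x_i), Y is continuous with density f_Y, and f(x_i, y) is the joint mass-density.

```latex
% Mixed-pair mutual information: X discrete, Y continuous,
% with p_i = P(X = x_i) and f_Y(y) = \sum_i f(x_i, y).
I(X;Y) = \sum_i \int f(x_i, y) \, \log \frac{f(x_i, y)}{p_i \, f_Y(y)} \, \mathrm{d}y
```

Roughly speaking, the good mixed-pair assumption of Beknazaryan et al. (2019) is a sufficient condition for this sum of integrals to be well defined.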
“…Another challenge for estimating mutual information in our case is due to the nature of W. Several mixed-pair estimation methods, including Beknazaryan et al. (2019), may not apply to our case because they require estimating H(P_t | W = w) separately for each given value of W; see App. A for more details.…”
Section: Estimation Challenges
mentioning, confidence: 99%
“…A current limitation of the algorithm is that it is limited to homogeneous data sets and cannot be applied to mixed data types. Even though initial proposals for mixed-type MI estimators exist (e.g., Ross (2014); Gao et al (2017); Rahimzamani et al (2018); Beknazaryan et al (2019)), their integration into the proposed forward-selection algorithm is not straightforward and is subject to future work.…”
Section: Choice of Estimators for CMI Estimation
mentioning, confidence: 99%
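Of the mixed-type estimators listed in that excerpt, the nearest-neighbour approach of Ross (2014) is the easiest to sketch: for each sample, take the k-th nearest-neighbour distance among the points sharing its discrete value, then count how many points in the full sample fall within that distance, and combine the counts through digamma terms. The sketch below assumes every discrete value occurs at least k+1 times and that the continuous values have no exact ties; the function name and tie handling are illustrative, not a reference implementation.

```python
import numpy as np
from scipy.special import digamma

def mi_mixed_knn(x, y, k=3):
    """Sketch of the Ross (2014) kNN MI estimator: x discrete, y continuous."""
    x = np.asarray(x)
    y = np.asarray(y, dtype=float)
    n = len(y)
    terms = np.empty(n)
    for i in range(n):
        same = x == x[i]
        n_x = int(same.sum())                        # points sharing the label x_i
        d_same = np.sort(np.abs(y[same] - y[i]))[1:] # drop the zero self-distance
        eps = d_same[k - 1]                          # k-th NN distance within the label
        m = int((np.abs(y - y[i]) <= eps).sum()) - 1 # neighbours in the full sample
        terms[i] = digamma(k) - digamma(m) - digamma(n_x)
    return float(terms.mean() + digamma(n))
```

Unlike kernel plug-in estimators, this construction needs no bandwidth choice, which is one reason the kNN family is popular for mixed data.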