2017
DOI: 10.7287/peerj.preprints.3345
Preprint
Potential conditional mutual information: Estimators and properties

Abstract: The conditional mutual information I(X; Y|Z) measures the average information that X and Y contain about each other given Z. This is an important primitive in many learning problems, including conditional independence testing, graphical model inference, causal strength estimation and time-series problems. In several applications, it is desirable to have a functional purely of the conditional distribution p_{Y|X,Z} rather than of the joint distribution p_{X,Y,Z}. We define the potential conditional mutual informat…
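As a concrete illustration of the quantity the abstract defines, I(X; Y|Z) can be computed directly from empirical counts in the discrete case via the identity I(X; Y|Z) = Σ p(x,y,z) log[p(x,y,z) p(z) / (p(x,z) p(y,z))]. The sketch below is a minimal plug-in estimate (the function name and approach are illustrative, not the paper's proposed estimator):

```python
import math
from collections import Counter

def plug_in_cmi(samples):
    """Plug-in estimate of I(X; Y | Z) in nats from discrete (x, y, z) triples.

    Uses I(X; Y | Z) = sum_{x,y,z} p(x,y,z) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ],
    with all probabilities replaced by empirical frequencies.
    """
    n = len(samples)
    c_xyz = Counter(samples)
    c_xz = Counter((x, z) for x, _, z in samples)
    c_yz = Counter((y, z) for _, y, z in samples)
    c_z = Counter(z for _, _, z in samples)
    total = 0.0
    for (x, y, z), c in c_xyz.items():
        # The four 1/n normalisations cancel inside the log ratio.
        total += (c / n) * math.log((c * c_z[z]) / (c_xz[(x, z)] * c_yz[(y, z)]))
    return total
```

For example, with Z a fair bit, X a fair bit independent of Z, and Y = X XOR Z, X and Y are marginally independent yet I(X; Y|Z) = log 2 nats, which this plug-in estimate recovers exactly from the four equiprobable triples.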

Cited by 3 publications (3 citation statements)
References 13 publications
“…Using a similar technique to estimate conditional mutual information, Frenzel and Pompe (FP) first, though several other papers as well [17], [18], [19], [20], [21] used…”
Section: Frenzel and Pompe Estimator of Conditional Mutual Information
confidence: 99%
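The Frenzel-Pompe (FP) estimator referenced in this statement replaces plug-in densities with k-nearest-neighbour statistics: with ε_i the Chebyshev (max-norm) distance from sample i to its k-th nearest neighbour in the joint (x, y, z) space, it averages ψ(k) − ψ(n_xz + 1) − ψ(n_yz + 1) + ψ(n_z + 1), where the n terms count neighbours within ε_i in the marginal subspaces. A brute-force O(n²) sketch under those assumptions (variable names ours; a practical implementation would use k-d trees):

```python
import math

def digamma(m):
    """Digamma at a positive integer m: psi(m) = -gamma + sum_{j=1}^{m-1} 1/j."""
    return -0.5772156649015329 + sum(1.0 / j for j in range(1, m))

def fp_cmi(xs, ys, zs, k=3):
    """Brute-force Frenzel-Pompe kNN estimate of I(X; Y | Z) for scalar samples."""
    n = len(xs)
    acc = 0.0
    for i in range(n):
        # Chebyshev distances from point i to all other points in the joint space.
        dists = sorted(
            max(abs(xs[i] - xs[j]), abs(ys[i] - ys[j]), abs(zs[i] - zs[j]))
            for j in range(n) if j != i
        )
        eps = dists[k - 1]  # distance to the k-th nearest neighbour
        # Counts of strictly-closer neighbours in the (x,z), (y,z) and z subspaces.
        n_xz = sum(j != i and max(abs(xs[i] - xs[j]), abs(zs[i] - zs[j])) < eps
                   for j in range(n))
        n_yz = sum(j != i and max(abs(ys[i] - ys[j]), abs(zs[i] - zs[j])) < eps
                   for j in range(n))
        n_z = sum(j != i and abs(zs[i] - zs[j]) < eps for j in range(n))
        acc += digamma(k) - digamma(n_xz + 1) - digamma(n_yz + 1) + digamma(n_z + 1)
    return acc / n
```

When X and Y are conditionally independent given Z (for instance, each equal to Z plus independent noise), the estimate concentrates near zero, which is what makes it usable for conditional independence testing.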
“…2 simultaneously. First, CMI can measure the nonlinear dependency (Rahimzamani and Kannan, 2017) between two random variables, making it a competent statistic to interpret nonlinear CCA variants. Second, the conditional independent constraint for CCA-based models is automatically satisfied with the minimum CMI, since the optimal, I(X; Y |Z) = 0 (Cover and Thomas, 2012), is achieved with Eq.…”
Section: Minimum CMI: A New Criterion for CCA
confidence: 99%
“…4. Directed information measures the amount of information between two random processes [14,15] and is shown as the correct metric in identifying time-series graphical models [16][17][18][19][20][21].…”
Section: Conditional Mutual Information Measures the Amount of Inform…
confidence: 99%