ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp40776.2020.9053422
Conditional Mutual Information Neural Estimator

Abstract: Several recent works in communication systems have proposed to leverage the power of neural networks in the design of encoders and decoders. In this approach, these blocks can be tailored to maximize the transmission rate based on aggregated samples from the channel. Motivated by the fact that, in many communication schemes, the achievable transmission rate is determined by a conditional mutual information, this paper focuses on neural-based estimators for this information-theoretic quantity. Our results are b…
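As a concrete illustration of the variational approach behind such neural estimators, the sketch below evaluates the Donsker-Varadhan lower bound on mutual information for a correlated bivariate Gaussian, where the optimal critic (the log density ratio) is known in closed form. This is a simplified, hypothetical example, not code from the paper: it estimates plain MI rather than conditional MI, and in a neural estimator the critic would be a trained network rather than an analytic formula.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation of the bivariate Gaussian
n = 200_000        # number of samples

# Samples from the joint distribution p(x, y).
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Samples from the product of marginals p(x)p(y): shuffle y to break the dependence.
y_shuf = rng.permutation(y)

def critic(x, y):
    # Optimal critic T*(x, y) = log p(x, y) - log p(x)p(y) for this Gaussian pair;
    # a neural estimator would learn this function from samples instead.
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * x**2 - 2 * rho * x * y + rho**2 * y**2) / (2 * (1 - rho**2)))

# Donsker-Varadhan lower bound: E_joint[T] - log E_marginals[exp(T)].
dv = critic(x, y).mean() - np.log(np.exp(critic(x, y_shuf)).mean())

# Closed-form MI of a bivariate Gaussian for comparison.
true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV estimate: {dv:.4f}, analytic MI: {true_mi:.4f}")
```

With the optimal critic, the bound is tight, so the sample estimate should closely track the analytic value; a learned critic can only approach it from below, which is why such estimates are lower bounds.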

Cited by 13 publications (12 citation statements); References 28 publications.
“…3 and 4. Here, we have chosen α = α * in (46) so that the value of the critical coupling K c matches the exact result given in (44). As expected from our analysis above, we observe a qualitative difference in behavior for K above vs. below the critical value K c .…”
Section: D Classical Ising Model (supporting)
confidence: 60%
“…The modification due to Maris and Kadanoff in (41) corresponds to α = 3 2 , which gives a non-trivial fixed point at (42). The value α * of α that would produce the exact critical coupling (44) is slightly greater than this:…”
Section: D Classical Ising Model (mentioning)
confidence: 98%
“…Several variants of these bounds have been reviewed in [9]. Variational bounds are tight, and the estimators proposed in [1, 4, 10, 11] leverage this property and use neural networks to approximate the bounds and correspondingly the desired information measure. These estimators were shown to be consistent (i.e., the estimation converges asymptotically to the true value) and suitably estimate MI and CMI when the samples are independently and identically distributed (i.i.d.).…”
Section: Introduction (mentioning)
confidence: 99%
“…However, these mutual information approximations are lower bounds and cannot be used as approximations for the leakage, as an upper bound would be needed. A possible workaround for certain channels is given by [13], which shows how a conditional mutual information can be estimated. iii) Reinforcement learning (RL):…”
Section: Introduction (mentioning)
confidence: 99%