2020
DOI: 10.1109/jsait.2020.2991561

The Information Bottleneck Problem and its Applications in Machine Learning

Cited by 111 publications (94 citation statements)
References 31 publications

“…We then showed that both IB and PF are closely related to several information-theoretic coding problems such as noisy random coding, hypothesis testing against independence, and dependence dilution. While these connections were partially known in previous work (see e.g., [29,30]), we show that they lead to an improvement on the cardinality of T for computing IB. We then turned our attention to the continuous setting where X and Y are continuous random variables.…”
Section: Summary and Concluding Remarks (citation type: mentioning)
confidence: 55%
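
For context, here is a common formulation of the two problems named in this statement; the notation below is a standard one from the literature and not necessarily the citing paper's. Given a joint distribution p_{XY} and a representation T obeying the Markov chain T - X - Y, the IB and PF curves are

    IB(R) = \sup_{p_{T|X} :\, I(X;T) \le R} I(T;Y),
    PF(r) = \inf_{p_{T|X} :\, I(X;T) \ge r} I(T;Y).

The classical support-lemma argument guarantees an optimizer with |T| \le |X| + 1; the improvement on the cardinality of T mentioned in the statement tightens a bound of this kind.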
“…where R_noisy(D) is given in (29). We observed in Section 2.3 that IB is fully characterized by the mapping D → R_noisy(D) and thus by A.…”
Section: Theorem A1 ([112]) (citation type: mentioning)
confidence: 96%
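
For context, the characterization invoked here is usually phrased through noisy lossy source coding under logarithmic loss; equation (29) refers to the citing paper and is not reproduced here, so the sketch below uses the standard formulation as an assumption. With reproductions q that are probability distributions on the alphabet of Y and log-loss distortion d(y, q) = \log(1/q(y)),

    R_{\mathrm{noisy}}(D) = \min \{\, I(X;T) : \mathbb{E}[d(Y, q_T)] \le D \,\}, \quad T - X - Y \text{ Markov},

and since the optimal reproduction given T = t is q_t = p_{Y|T}(\cdot \mid t), the expected distortion equals H(Y|T). The constraint then reads I(T;Y) \ge H(Y) - D, so sweeping D traces out the IB trade-off, which is why the mapping D \mapsto R_{\mathrm{noisy}}(D) fully characterizes IB.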