2019
DOI: 10.3390/e21101022
On Data-Processing and Majorization Inequalities for f-Divergences with Applications

Abstract: This paper is focused on derivations of data-processing and majorization inequalities for f-divergences, and their applications in information theory and statistics. For the accessibility of the material, the main results are first introduced without proofs, followed by exemplifications of the theorems with further related analytical results, interpretations, and information-theoretic applications. One application refers to the performance analysis of list decoding with either fixed or variable list sizes; so…
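The data-processing inequality the abstract refers to states that passing two distributions through a common channel cannot increase the f-divergence between them. A minimal numeric sketch for the special case of KL divergence (the f-divergence with f(t) = t log t); the channel W and the distributions p, q below are illustrative, not taken from the paper:

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p||q) in nats; assumes q has full support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log 0 is taken as 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A row-stochastic channel W maps an input distribution p to the output pW.
W = np.array([[0.9, 0.1],
              [0.2, 0.8]])

p = np.array([0.3, 0.7])
q = np.array([0.6, 0.4])

# Data-processing inequality: D(pW || qW) <= D(p || q).
assert kl_divergence(p @ W, q @ W) <= kl_divergence(p, q)
```

The same monotonicity holds for any f-divergence with convex f satisfying f(1) = 0, which is the generality at which the paper works.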

Cited by 19 publications (21 citation statements); References 61 publications.
“…The results we obtain in this paper are closely related to the contents given in [1–5]. Moreover, some related results with the present topic can also be found in [10,11,27,41,42].…”
Section: Introduction (supporting)
Confidence: 83%
“…If we reverse the sign of inequalities in (41) and (43), then inequalities (42) and (44) are also reversed.…”
Section: Theorem (mentioning)
Confidence: 99%
“…Since the vector-skew Jensen divergence is an f-divergence, we easily obtain Fano and Pinsker inequalities following [32], or reverse Pinsker inequalities following [33,34] (i.e., upper bounds for the vector-skew Jensen divergences using the total variation metric distance), data-processing inequalities using [35], etc.…”
Section: Extending the Jensen–Shannon Divergence (mentioning)
Confidence: 99%
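Pinsker's inequality, mentioned in the citation above, bounds the total variation distance by the KL divergence: δ(P, Q) ≤ √(D(P‖Q)/2), with D in nats. A minimal numeric check on illustrative two-point distributions (not taken from the cited works):

```python
import numpy as np

p = np.array([0.3, 0.7])
q = np.array([0.6, 0.4])

tv = 0.5 * float(np.sum(np.abs(p - q)))   # total variation distance
kl = float(np.sum(p * np.log(p / q)))     # KL divergence D(p||q) in nats

# Pinsker's inequality: tv <= sqrt(kl / 2)
assert tv <= np.sqrt(kl / 2.0)
```

The "reverse Pinsker" inequalities cited run the other way, upper-bounding the divergence in terms of the total variation distance under additional assumptions on the distributions.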
“…Polyanskiy–Verdú [57] showed a lower bound on Sibson's α-mutual information by using the data-processing lemma for the Rényi divergence. Recently, Sason [58] generalized Fano's inequality with list decoding via the strong data-processing lemma for f-divergences.…”
Section: Introduction (mentioning)
Confidence: 99%