2020
DOI: 10.1080/03610926.2020.1813305
An extension of entropy power inequality for dependent random variables

Cited by 2 publications (2 citation statements)
References 21 publications
“…However, there are several real situations, such as in radar and sonar systems, in which the noise is highly dependent on the transmitted signal [11]. It was illustrated in [16] that, under some assumptions, Shannon’s EPI can hold for weakly dependent random variables; [3] extended the EPI to dependent random variables with arbitrary distributions; and [10] provided certain conditions under which the conditional EPI can hold for dependent summands as well.…”
Section: Introduction
confidence: 99%
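The statement above concerns Shannon's entropy power inequality, which for independent random variables X and Y asserts e^{2h(X+Y)} ≥ e^{2h(X)} + e^{2h(Y)}, with equality when X and Y are independent Gaussians. A minimal numerical sketch of that equality case (the variances v1, v2 are illustrative values, not taken from the paper):

```python
import math

def gaussian_entropy(v):
    # Differential entropy of a Gaussian with variance v: h = 0.5 * ln(2*pi*e*v)
    return 0.5 * math.log(2 * math.pi * math.e * v)

def entropy_power(h):
    # Entropy power N = exp(2h) / (2*pi*e); for a Gaussian this equals its variance
    return math.exp(2 * h) / (2 * math.pi * math.e)

v1, v2 = 2.0, 3.0
# The sum of independent Gaussians is Gaussian with variance v1 + v2
lhs = entropy_power(gaussian_entropy(v1 + v2))
rhs = entropy_power(gaussian_entropy(v1)) + entropy_power(gaussian_entropy(v2))
print(lhs, rhs)  # EPI: lhs >= rhs; equality here since both summands are Gaussian
```

For dependent summands, as the cited works discuss, the inequality can fail without further assumptions, which is what motivates the extensions in [3], [10], and [16].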
“…The entropy power inequality (EPI), viewed as a fundamental property of differential entropy, has been investigated extensively. More recently, differential entropy has been used as a tool to study uncertainty relations based on entropy power [13][14][15]. Consider a bipartite quantum system with Hilbert space C^m ⊗ C^n (m ≤ n).…”
Section: Introduction
confidence: 99%