2019
DOI: 10.1017/s0269964818000487
Inequalities for the Dependent Gaussian Noise Channels Based on Fisher Information and Copulas

Abstract: Considering the Gaussian noise channel, Costa [4] investigated the concavity of the entropy power when the input signal and noise components are independent. His argument was connected to the first-order derivative of the Fisher information. In real situations, however, the noise can be highly dependent on the main signal. In this paper, we suppose that the input signal and noise variables are dependent. Then, some well-known copula functions are used to define their dependence structure. The first- and second…
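For orientation (this is standard background, not drawn from the truncated abstract, and the notation below is an assumed one), Costa's independent-noise setting involves the entropy power and the Fisher information of a signal X observed in additive Gaussian noise:

\[
N(X) = \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
I(X) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial x}\ln p_X(X)\right)^{2}\right],
\]

where h(X) is the differential entropy and p_X the density of X. For Y_t = X + \sqrt{t}\,Z with Z ~ N(0,1) independent of X, de Bruijn's identity gives

\[
\frac{\partial}{\partial t}\, h(Y_t) = \tfrac{1}{2}\, I(Y_t),
\]

and Costa's theorem states that the entropy power N(Y_t) is concave in t. The dependence structure between input and noise that the paper considers is specified through a copula C via Sklar's theorem,

\[
F_{X,W}(x,w) = C\big(F_X(x),\, F_W(w)\big),
\]

with F_X and F_W the marginal distribution functions of the signal X and the noise W.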

Cited by 2 publications (3 citation statements); References 22 publications.
“…Now, under the same conditions as in Corollary 1, according to the relations (51) and (52), the first-order derivative of the Fisher information simply follows by setting m = 1 and p_j(y; t) = p(y; t) in (34). This coincides with the result in [4], where a direct proof of (53) is provided.…”
Section: The One-dimensional Case (supporting, confidence: 90%)
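As hedged background (the excerpt does not show equations (34) and (51)–(53), so the exact correspondence is an assumption), the classical one-dimensional identity for the first-order derivative of the Fisher information along the Gaussian perturbation Y_t = X + \sqrt{t}\,Z, proved directly in the independent-noise setting of Costa [4], reads

\[
\frac{d}{dt}\, I(Y_t)
= -\,\mathbb{E}\!\left[\left(\frac{\partial^{2}}{\partial y^{2}}\,\ln p(y;t)\Big|_{y=Y_t}\right)^{2}\right]
\;\le\; -\,I(Y_t)^{2},
\]

where p(y;t) is the density of Y_t, which satisfies the heat equation \partial_t p = \tfrac{1}{2}\,\partial_y^2 p. The inequality follows from the Cauchy–Schwarz inequality, since \mathbb{E}[\partial_y^2 \ln p(Y_t;t)] = -I(Y_t), and is equivalent to the concavity of the entropy power N(Y_t).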