2012
DOI: 10.3390/e14081469
An Integral Representation of the Relative Entropy

Abstract: Recently, an identity of de Bruijn type between the relative entropy and the relative Fisher information, with the reference measure also moving, was unveiled by Verdú via the MMSE in estimation theory. In this paper, we give another, more direct proof of this identity, in which the derivative is calculated by applying integration by parts together with the heat equation. We also derive an integral representation of the relative entropy; as one of its applications, the logarithmic Sobolev inequality for centered…
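For orientation, the identity and the integral representation referred to in the abstract can be sketched as follows. The notation and the factor 1/2 coming from the heat-equation normalization are assumptions made for illustration, not taken verbatim from the (truncated) abstract:

```latex
% De Bruijn-type identity: p_t and q_t both evolve under the same
% heat equation \partial_t p_t = \tfrac{1}{2}\Delta p_t, and
\frac{d}{dt}\, D(p_t \,\|\, q_t) \;=\; -\tfrac{1}{2}\, I(p_t \,\|\, q_t),

% where I(\cdot\,\|\,\cdot) denotes the relative Fisher information.
% Integrating in t (with D(p_t \,\|\, q_t) \to 0 as t \to \infty)
% yields an integral representation of the relative entropy:
D(p \,\|\, q) \;=\; \tfrac{1}{2}\int_{0}^{\infty} I(p_t \,\|\, q_t)\,dt .
```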

Cited by 5 publications (7 citation statements)
References 18 publications
“…An alternative proof of this identity, by direct calculation using integration by parts, has been given in [5]. It should be noted here that, in the formula of Lemma 1, the reference measure evolves according to the same heat equation.…”
Section: Introduction
confidence: 91%
“…In this paper, we treat the Fokker-Planck equation with a strictly convex potential as our continuity equation, because it is the first natural extension of the heat equation, and a dissipation formula similar to that in Lemma 2 of Verdú can be derived by the fundamental method of integration by parts, as in [5].…”
Section: Lemma
confidence: 99%
“…As far as it was possible to determine, the first definition of the relative Fisher information was given by Otto and Villani [36], who defined it for the translationally-invariant case. This expression has since been rediscovered, or simply used, in many applications across different problems and fields [22,[37][38][39][40][41][42][43][44]. Moreover, it seems that the first general analysis of the relative Fisher information was presented by the author in [45].…”
Section: Relative Fisher Information Type I
confidence: 99%
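For readers unfamiliar with the quantity under discussion, the translationally-invariant relative Fisher information attributed here to Otto and Villani can be written as follows; the notation is an assumption chosen for illustration:

```latex
I(p \,\|\, q)
\;=\;
\int_{\mathbb{R}^n} p(x)\,
\bigl|\nabla \log p(x) - \nabla \log q(x)\bigr|^{2}\, dx ,
```

which vanishes exactly when p = q (almost everywhere) and serves as the dissipation term in the de Bruijn-type identity discussed in this paper.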
“…Stam's inequality [8,9,40,[47][48][49][50] states a lower bound on the Fisher information that links the Fisher information with Shannon's entropy power. However, this expression is limited to the special case where the parameter in the Fisher information expression is a location parameter.…”
Section: Lower Bound for Fisher Information
confidence: 99%
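The inequality quoted here, in its standard form, links the entropy power and the Fisher information of a random vector. This statement is standard background, not quoted from the cited works:

```latex
% Stam's inequality for an R^n-valued random vector X:
N(X)\, J(X) \;\ge\; n,
\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)/n},
```

where h(X) is the differential entropy, J(X) is the Fisher information with respect to a location parameter, and equality holds if and only if X is Gaussian.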
“…Recently, the RFI has been the subject of intense investigations in a quest to obtain a better perspective of its physical implications [10][11][12][13][14], and to formally establish its role in information theory and estimation theory [15,16]. Still, many of the fundamental properties and physical implications of the RFI remain uninvestigated.…”
Section: Introduction
confidence: 99%