2013 | DOI: 10.1109/tit.2012.2227677
Directed Information, Causal Estimation, and Communication in Continuous Time

Abstract: A notion of directed information between two continuous-time processes is proposed. A key component in the definition is taking an infimum over all possible partitions of the time interval, which plays a role no less significant than the supremum over "space" partitions inherent in the definition of mutual information. Properties and operational interpretations in estimation and communication are then established for the proposed notion of directed information. For the continuous-time additive white Gaussian n…
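For orientation, Massey's discrete-time directed information, which the paper extends to continuous time, is recalled below. The continuous-time expression is only a schematic rendering of the partition-based construction described in the abstract; the notation and the exact form of the partition terms are assumptions, not reproduced from the paper.

```latex
% Massey's discrete-time directed information (standard definition).
\[
  I(X^n \to Y^n) \;=\; \sum_{i=1}^{n} I\!\left(X^i ; Y_i \,\middle|\, Y^{i-1}\right).
\]
% Schematic continuous-time form suggested by the abstract: an infimum over
% time partitions 0 = t_0 < t_1 < ... < t_k = T (notation assumed; see the
% paper for the precise definition).
\[
  I\!\left(X_0^T \to Y_0^T\right) \;=\;
  \inf_{0 = t_0 < \dots < t_k = T} \;\sum_{i=1}^{k}
  I\!\left(X_0^{t_i} ; Y_{t_{i-1}}^{t_i} \,\middle|\, Y_0^{t_{i-1}}\right).
\]
```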

Cited by 45 publications (39 citation statements). References 47 publications (123 reference statements).
“…Interestingly, for our Gaussian channel model, the DI and MI are equal, which is at first somewhat surprising (in channels where feedback is employed, DI is often strictly less than MI). While this might be shown by relating and extending the proof of [18, Proposition 3, part 3] from continuous time to discrete time, and from scalar to vector observations, we prove it directly using linear algebra. …”
Section: Information Gain / Mutual Information (mentioning)
confidence: 96%
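As a concrete, hedged illustration of the DI-versus-MI comparison discussed in this citation, the sketch below computes both quantities for a toy jointly Gaussian, feedback-free channel using log-det and Schur-complement formulas. The AR(1) input, the AWGN model, and all function names are assumptions made for illustration; this is not the citing paper's channel model or proof.

```python
# Illustrative sketch only (assumed toy model, not the citing paper's channel):
# compare directed information I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}) with
# mutual information I(X^n; Y^n) for jointly Gaussian vectors, via log-det and
# Schur-complement formulas. All quantities are in nats.
import numpy as np

def cond_var(cov, i, cond_idx):
    """Conditional variance of component i given the components in cond_idx."""
    if not cond_idx:
        return cov[i, i]
    S = cov[np.ix_(cond_idx, cond_idx)]
    c = cov[i, cond_idx]
    return cov[i, i] - c @ np.linalg.solve(S, c)

def gaussian_mi(cov, idx_a, idx_b):
    """I(A; B) = 0.5 (log det K_A + log det K_B - log det K_{A,B}) for Gaussians."""
    logdet = lambda idx: np.linalg.slogdet(cov[np.ix_(idx, idx)])[1]
    return 0.5 * (logdet(idx_a) + logdet(idx_b) - logdet(idx_a + idx_b))

def gaussian_di(cov, x_idx, y_idx):
    """I(X^n -> Y^n) = sum_i [h(Y_i | Y^{i-1}) - h(Y_i | Y^{i-1}, X^i)] for Gaussians."""
    di = 0.0
    for i in range(len(y_idx)):
        past_y = y_idx[:i]
        di += 0.5 * (np.log(cond_var(cov, y_idx[i], past_y))
                     - np.log(cond_var(cov, y_idx[i], past_y + x_idx[:i + 1])))
    return di

# Assumed toy feedback-free AWGN channel: X^n is an AR(1) input, Y_i = X_i + Z_i.
n, rho, noise_var = 5, 0.7, 0.5
Kx = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))   # AR(1) covariance
cov = np.block([[Kx, Kx], [Kx, Kx + noise_var * np.eye(n)]])        # joint cov of (X^n, Y^n)

x_idx, y_idx = list(range(n)), list(range(n, 2 * n))
print("MI:", gaussian_mi(cov, x_idx, y_idx))
print("DI:", gaussian_di(cov, x_idx, y_idx))   # equals MI here: no feedback in this toy model
```

In this feedback-free toy setting the two printed values coincide; with feedback (inputs allowed to depend on past outputs) the joint covariance changes and the same routines would generally return a DI strictly below the MI, which is what makes the equality reported in this citation notable.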
“…We show that they are still valid in the presence of feedback by substituting mutual information with directed information, in some cases using the continuous-time notion developed in [6].…”
Section: Introduction (mentioning)
confidence: 94%
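The substitution of mutual information by directed information under feedback rests on the standard discrete-time decomposition recalled below (Massey's conservation law; notation assumed). The reverse term vanishes when the input does not depend on past outputs, which is why DI and MI coincide in the feedback-free case.

```latex
% Decomposition of mutual information into forward directed information and a
% reverse (delayed) directed-information term; the latter is zero without feedback.
\[
  I(X^n ; Y^n)
  \;=\; \underbrace{\sum_{i=1}^{n} I\!\left(X^i ; Y_i \mid Y^{i-1}\right)}_{I(X^n \to Y^n)}
  \;+\; \underbrace{\sum_{i=1}^{n} I\!\left(Y^{i-1} ; X_i \mid X^{i-1}\right)}_{\text{reverse term (feedback)}} .
\]
```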
“…where the last inequality follows by upper bounding the integral over [0, snr_0] by the LMMSE bound in (11) and by upper bounding the integral over [snr_0, snr] using the SCPP bound in (21). Figure 7 shows a plot of C_∞(snr, snr_0, β) in (54) normalized by the capacity of the point-to-point channel, (1/2) log(1 + snr).…”
Section: 2 (mentioning)
confidence: 99%
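The equation numbers (11), (21), and (54) belong to the citing paper and are not reproduced here. As a generic reminder of how an LMMSE bound caps the low-SNR portion of an I-MMSE integral (the unit-variance noise normalization and the symbol σ_X² are assumptions, not the cited paper's notation):

```latex
% LMMSE upper bound on the MMSE of estimating X (variance \sigma_X^2) from
% \sqrt{\gamma} X + N with N ~ N(0,1), and the resulting bound on the
% I-MMSE integral over the low-SNR interval [0, snr_0].
\[
  \mathrm{mmse}(\gamma) \;\le\; \mathrm{lmmse}(\gamma)
  \;=\; \frac{\sigma_X^2}{1 + \sigma_X^2\,\gamma},
  \qquad
  \frac{1}{2}\int_0^{\mathsf{snr}_0} \mathrm{mmse}(\gamma)\, d\gamma
  \;\le\; \frac{1}{2}\log\!\left(1 + \sigma_X^2\,\mathsf{snr}_0\right).
\]
```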
“…The I-MMSE relationship has been extended to continuous-time channels in [8] and generalized in [18] using Malliavin calculus. For other continuous-time generalizations the reader is referred to [19]-[21]. Finally, Venkat and Weissman [22] dispensed with the expectation and provided a pointwise identity that has given additional insight into this relationship.…”
Section: Introduction (mentioning)
confidence: 99%
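For reference, the two relationships alluded to in this citation are recalled in their standard scalar and continuous-time AWGN forms (the Guo-Shamai-Verdu I-MMSE identity and Duncan's theorem); the notation below is assumed and does not follow any particular cited paper.

```latex
% Scalar I-MMSE relation: the derivative of mutual information with respect
% to SNR equals half the MMSE.
\[
  \frac{d}{d\,\mathsf{snr}}\, I\!\left(X ; \sqrt{\mathsf{snr}}\,X + N\right)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr}),
  \qquad N \sim \mathcal{N}(0,1).
\]
% Duncan's theorem for the continuous-time AWGN channel
% dY_t = \sqrt{snr} X_t dt + dW_t: mutual information equals half the
% time-integrated causal MMSE, scaled by snr.
\[
  I\!\left(X_0^T ; Y_0^T\right)
  \;=\; \frac{\mathsf{snr}}{2} \int_0^T
  \mathbb{E}\!\left[\bigl(X_t - \mathbb{E}[X_t \mid Y_0^t]\bigr)^2\right] dt .
\]
```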