An application of the information theory to filtering problems

Tomita et al., 1976
DOI: 10.1016/0020-0255(76)90034-7

Cited by 21 publications (21 citation statements: 0 supporting, 21 mentioning, 0 contrasting; published 1977–2021). References 7 publications.

Citation statements, ordered by relevance:
“…For example, a variety of information-theoretic measures of optimality, such as the mutual information between the observation and the estimation error, are considered in [30], [14] and [8]. In particular, these articles show that many of these measures are optimised by the Kalman filter in the linear Gaussian filtering problem of this article.…”
Section: Introduction (mentioning)
confidence: 99%
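The optimality claim in this excerpt can be illustrated numerically. The sketch below is a minimal illustration with assumed system matrices, not taken from any of the cited papers: it runs the standard discrete-time Kalman covariance recursion and compares the Gaussian entropy of the estimation error under the Kalman gain against a deliberately suboptimal gain; the Kalman gain yields the smaller error entropy.

```python
# Minimal sketch, assuming illustrative system matrices (not from the cited
# papers): the Kalman gain minimises the posterior error covariance P, and
# hence the error entropy H(e) = 0.5 * ln((2*pi*e)^m * det(P)).
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # assumed state transition
H = np.array([[1.0, 0.0]])               # assumed observation matrix
Q = 0.1 * np.eye(2)                      # assumed process noise covariance
R = np.array([[0.5]])                    # assumed measurement noise covariance

def error_entropy(P):
    """Differential entropy of a zero-mean Gaussian error with covariance P."""
    m = P.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** m * np.linalg.det(P))

def updated_cov(P_pred, K):
    """Posterior error covariance for an arbitrary gain K (Joseph form)."""
    I = np.eye(P_pred.shape[0])
    return (I - K @ H) @ P_pred @ (I - K @ H).T + K @ R @ K.T

P = np.eye(2)
for _ in range(50):                       # iterate towards steady state
    P_pred = A @ P @ A.T + Q              # time update
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    P = updated_cov(P_pred, K)

print(error_entropy(updated_cov(P_pred, K)))        # Kalman gain: minimal
print(error_entropy(updated_cov(P_pred, 0.5 * K)))  # suboptimal gain: larger
```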
“…The above equation can also be expressed in entropy terms as follows (Tomita et al. 1976): where H(Zt) is the entropy of the variable Zt;…”
Section: Formulation of Kalman Filter Estimation Model (mentioning)
confidence: 99%
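The excerpt elides the equation itself. For reference, the differential entropy of an m-dimensional Gaussian variable has a standard closed form, sketched below; that this is the expression the excerpt refers to is an assumption, though it is the usual entropy formula for Kalman filter variables.

```python
# Standard differential entropy of an m-dimensional Gaussian Zt with
# covariance Sigma: H(Zt) = 0.5 * ln((2*pi*e)^m * det(Sigma)).
# (Assumed to be the expression the excerpt elides.)
import numpy as np

def gaussian_entropy(Sigma):
    m = Sigma.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** m * np.linalg.det(Sigma))

print(gaussian_entropy(np.diag([0.5, 2.0])))  # example 2-D covariance
```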
“…in which the variable Y is an m-dimensional vector. Tomita et al. (1976) used this error-entropy concept and estimated the matrices At and St+1 as follows:…”
Section: Formulation of Kalman Filter Estimation Model (mentioning)
confidence: 99%
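The formulas for At and St+1 are elided in the excerpt, and the sketch below does not reconstruct them. As general background only (an assumption, not Tomita et al.'s derivation), the error-covariance recursion that such entropy-based estimates typically build on is the standard Kalman one:

```python
# Background sketch only: the standard Kalman error-covariance recursion.
# The excerpt's actual formulas for At and St+1 are elided and NOT
# reconstructed here.
import numpy as np

def covariance_step(P, A, H, Q, R):
    """One predict/update cycle of the error covariance."""
    P_pred = A @ P @ A.T + Q                       # predicted error covariance
    S = H @ P_pred @ H.T + R                       # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    return (np.eye(P.shape[0]) - K @ H) @ P_pred   # posterior error covariance
```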
“…Similar quantification is performed for non-Gaussian signals [9,10] and the fractional Gaussian channel [11]. On the other hand, [12] showed that the optimal filter for a linear system that maximizes the mutual information between the observation history over [0, t] and the state value at t is the Kalman-Bucy filter; [13] related this mutual information to the Fisher information matrix. Recently, Mitter and Newton [14] presented an expression for the mutual information between the signal path during [s, t], s < t, and the observation history, with a statistical-mechanical interpretation of this expression.…”
Section: Introduction (mentioning)
confidence: 99%
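For jointly Gaussian state and observations, the mutual information in this excerpt reduces to a ratio of covariance determinants, which is easy to evaluate alongside a Kalman filter. The sketch below is a discrete-time analogue, an illustration rather than the continuous-time Kalman-Bucy setting of [12]:

```python
# For jointly Gaussian x and y: I(x; y) = H(x) - H(x | y)
#                                       = 0.5 * ln(det(P_prior) / det(P_post)),
# where P_prior is the unconditional state covariance and P_post is the
# filter's posterior error covariance. Discrete-time illustration only.
import numpy as np

def gaussian_mutual_information(P_prior, P_post):
    return 0.5 * np.log(np.linalg.det(P_prior) / np.linalg.det(P_post))

print(gaussian_mutual_information(np.eye(2), 0.2 * np.eye(2)))  # ~1.609 nats
```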
“…Regarding the quantification of the mutual information, this work first presents the filter form, which is a simple extension of the previous work [12]-[14], by treating the forecast problem as a filtering problem with a longer time window. However, it then shows that this form may not be suitable for motion planning with a long-term forecast in three respects: sensitivity to the model accuracy, computational cost, and the lack of on-the-fly knowledge of the accumulated information.…”
Section: Introduction (mentioning)
confidence: 99%
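The sensitivity the excerpt mentions can be seen in a small open-loop example: forecasting with a longer window amounts to propagating the error covariance forward without measurement updates, so uncertainty compounds with the horizon. The dynamics below are assumed for illustration, not taken from the cited work.

```python
# Hedged illustration with assumed dynamics: prediction-only covariance
# propagation, showing uncertainty (det P) growing with the forecast horizon.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])  # assumed transition matrix
Q = 0.01 * np.eye(2)                    # assumed process noise covariance

P = 0.1 * np.eye(2)                     # posterior covariance at forecast start
for k in range(1, 31):
    P = A @ P @ A.T + Q                 # prediction only: no observations
    if k % 10 == 0:
        print(k, np.linalg.det(P))      # det(P) grows with the horizon
```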