2020
DOI: 10.1109/access.2020.2986420

Analytic Minimum Mean-Square Error Bounds in Linear Dynamic Systems With Gaussian Mixture Noise Statistics

Abstract: Using state-space representation, mobile object positioning problems can be described as dynamic systems, with the state representing the unknown location and the observations being the information gathered from the location sensors. For linear dynamic systems with Gaussian noise, the Kalman filter provides the Minimum Mean-Square Error (MMSE) state estimation by tracking the posterior. Hence, by approximating non-Gaussian noise distributions with Gaussian Mixtures (GM), a bank of Kalman filters or Gaussian Su…
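The abstract's premise (approximate the non-Gaussian noise by a Gaussian mixture and track the posterior with a bank of Kalman filters) can be illustrated with a small sketch. The Python below is a minimal, illustrative Gaussian-sum update for a scalar linear system with GM measurement noise; it is not the paper's algorithm or its MMSE bounds, and all model parameters (a, q, h, the mixture weights, means and variances) are invented for the example.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Scalar Gaussian density N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def gm_kalman_step(x, P, z, a=1.0, q=0.1, h=1.0,
                   weights=(0.8, 0.2), means=(0.0, 0.0), variances=(0.05, 2.0)):
    """One predict/update cycle for x_k = a*x_{k-1} + w_k, z_k = h*x_k + v_k,
    where v_k follows the Gaussian mixture sum_i weights[i] * N(means[i], variances[i])."""
    # Prediction is the ordinary Kalman prediction.
    x_pred, P_pred = a * x, a * P * a + q

    comp_w, comp_x, comp_P = [], [], []
    for w_i, m_i, r_i in zip(weights, means, variances):
        # Kalman update against the i-th mixture component of the measurement noise.
        S = h * P_pred * h + r_i                 # innovation variance
        K = P_pred * h / S                       # Kalman gain
        comp_x.append(x_pred + K * (z - (h * x_pred + m_i)))
        comp_P.append((1.0 - K * h) * P_pred)
        # Posterior component weight: prior weight times measurement likelihood.
        comp_w.append(w_i * gaussian_pdf(z, h * x_pred + m_i, S))

    comp_w = np.array(comp_w) / np.sum(comp_w)
    comp_x, comp_P = np.array(comp_x), np.array(comp_P)
    # Collapse the posterior mixture back to a single Gaussian (moment matching).
    x_new = np.dot(comp_w, comp_x)
    P_new = np.dot(comp_w, comp_P + (comp_x - x_new) ** 2)
    return x_new, P_new

x, P = 0.0, 1.0
for z in [0.3, 0.5, 3.1, 0.6]:   # 3.1 is largely absorbed by the wide mixture component
    x, P = gm_kalman_step(x, P, z)
print(x, P)
```

Collapsing the posterior mixture back to a single Gaussian after each update keeps the number of hypotheses bounded; without some reduction step the exact posterior's component count grows with every measurement.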

Cited by 12 publications (4 citation statements)
References 50 publications
“…[1,2] In the state estimation realm, the need for a mixture distribution may arise from stochastically switched models [3,4,5], multi-modal data/noise [6,7,8,9] and data association uncertainty [10,11,12]. The best-known mixture is the Gaussian mixture [13,7], which consists of a finite number of Gaussian distributions. Recently, it has further been shown that the arithmetic average (AA) fusion, which has provided a compelling approach to multi-target density fusion/consensus over sensor networks [14,15,16,17,18,19,20,21], also results in a mixture distribution.…”
mentioning
confidence: 99%
“…• The estimation of unknown variables using the Kalman filter tends to be more accurate than that based on a single measurement [42]. • The Kalman filter optimizes the estimation error; specifically, it minimizes the mean squared error for systems with Gaussian noise [43]. • The Kalman filter has a promising ability to estimate (track) the system states (parameters) from noisy measurements [44].…”
Section: Kalman Filter Based RSRP Estimation
mentioning
confidence: 99%
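A minimal numerical illustration of the first two bullets quoted above: for a static scalar state observed in Gaussian noise, recursive Kalman updates drive the posterior error variance well below the variance of any single measurement. The setup (true value, noise variance R, number of measurements) is invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
true_x, R = 2.0, 1.0             # static state and measurement noise variance
x, P = 0.0, 100.0                # deliberately vague prior

for _ in range(20):
    z = true_x + rng.normal(scale=np.sqrt(R))
    K = P / (P + R)              # Kalman gain (H = 1, no process noise)
    x = x + K * (z - x)          # state update
    P = (1.0 - K) * P            # error variance shrinks with every measurement

print(x, P)                      # P ends up near R/20, far below the single-measurement R
```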
“…Finite mixtures are flexible and powerful probabilistic modeling tools for both univariate and multivariate data, which have been well acknowledged and widely used for pattern recognition, machine learning, state estimation, etc. [1,2] In the state estimation realm, the need for a mixture distribution may arise from stochastically switched models [3,4,5], multi-modal data/noise [6,7,8,9] and data association uncertainty [10,11,12]. The best-known mixture is the Gaussian mixture [13,7], which consists of a finite number of Gaussian distributions.…”
mentioning
confidence: 99%
“…Second, the linear fusion of a finite number of mixtures of the same parametric family remains a mixture of that family. Therefore, the finite mixture has become one of the most important filter structures, including the well-known Gaussian mixture [13,7], the Student's-t mixture [24] and multi-Bernoulli mixtures of various forms [25,26,27]. There are many other types of mixture models, such as the Watson mixture model [28] for axially symmetric data, the inverted Beta mixture model [29] for non-symmetric data, and the von Mises-Fisher mixture model [30] for directional data (such as bearing measurements).…”
mentioning
confidence: 99%
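The closure property asserted in the statement above (a linear, e.g. arithmetic-average, fusion of mixtures from the same parametric family is again a mixture of that family) is easy to see constructively: the fused density is just the union of the input components with their weights scaled by the fusion weights. The sketch below assumes Gaussian mixtures represented as (weight, mean, variance) triples; the densities and fusion weights are made up.

```python
def aa_fuse(gms, fusion_weights):
    """Arithmetic-average fusion p(x) = sum_s fusion_weights[s] * p_s(x) of Gaussian
    mixtures, each given as a list of (weight, mean, variance) components."""
    fused = []
    for w_s, gm in zip(fusion_weights, gms):
        fused.extend((w_s * w_i, m_i, v_i) for w_i, m_i, v_i in gm)
    return fused  # still a Gaussian mixture; weights sum to 1 if the inputs' did

gm_a = [(0.7, 0.0, 1.0), (0.3, 4.0, 0.5)]   # two-component GM from "sensor A"
gm_b = [(1.0, 1.0, 2.0)]                    # single Gaussian from "sensor B"
print(aa_fuse([gm_a, gm_b], fusion_weights=[0.5, 0.5]))
```

Only the component weights are touched, which is why the fused object stays inside the same parametric family.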