2020 43rd International Conference on Telecommunications and Signal Processing (TSP)
DOI: 10.1109/tsp49548.2020.9163401

Expectation Propagation and Transparent Propagation in Iterative Signal Estimation in the Presence of Impulsive Noise

Cited by 2 publications (6 citation statements)
References 9 publications
“…Exponential families have the pleasant property of being closed under the multiplication operation. Furthermore, it can be shown [20] that the projection operation in (23) simply reduces to equating the expectation of each feature with respect to the true distribution p(x), E_p[G_i(x)], with that computed with respect to the approximating distribution q(x), E_q[G_i(x)]. Such a procedure of matching the expectations (of the features) is at the heart of (and gives the name to) the celebrated EP (expectation propagation) algorithm [21], further applied in Section 4 to our problem.…”
Section: Kullback-Leibler Divergence
Citation type: mentioning, confidence: 99%
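
The feature-matching step described in the statement above has a simple numerical illustration. The following is a minimal sketch, not taken from the cited works: the approximating family is assumed Gaussian with features G_1(x) = x and G_2(x) = x^2, and the target p(x) is a Laplace density, chosen only as an example of a heavy-tailed (impulsive-noise-like) distribution, so the KL projection reduces to matching the mean and second moment.

    # Minimal sketch (not from the cited papers): KL projection onto a Gaussian
    # family by matching the expected features G_1(x) = x and G_2(x) = x^2.
    # The target p(x) is a Laplace(0, 1) density, used here only as an example
    # of a heavy-tailed distribution.
    import numpy as np

    x = np.linspace(-30.0, 30.0, 200001)   # integration grid
    p = 0.5 * np.exp(-np.abs(x))           # Laplace(0, 1) density
    p /= np.trapz(p, x)                    # ensure p integrates to 1

    m1 = np.trapz(x * p, x)                # E_p[G_1(x)] = E_p[x]
    m2 = np.trapz(x**2 * p, x)             # E_p[G_2(x)] = E_p[x^2]

    mu, var = m1, m2 - m1**2               # matched Gaussian parameters
    print(f"projected Gaussian: mean = {mu:.4f}, variance = {var:.4f}")  # ~0 and ~2
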
“…where ∑_i α_i = 1 for normalization. We shall approximate it by the nearest Gaussian q(x) in the sense of the KL divergence, i.e., by applying (23). For that purpose, given that the expectations of the features of a Gaussian distribution are simply the mass, the mean, and the mean-squared value of that Gaussian (i.e., the expectations of 1, x, and x^2), matching these moments computed under p(x) with those computed under q(x) simply amounts to equating the mean and variance of the two distributions (assuming that both p(x) and q(x) are normalized).…”
Section: Kullback-Leibler Divergence
Citation type: mentioning, confidence: 99%
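
The mean/variance matching described in this statement has a closed form when p(x) is a mixture of Gaussians: the matched mean is the weighted average of the component means, and the matched variance is the weighted average of the component second moments minus the squared matched mean. A minimal sketch with illustrative (not source-derived) weights, means, and variances:

    # Minimal sketch: collapsing a Gaussian mixture p(x) = sum_i alpha_i N(mu_i, s2_i)
    # onto the single Gaussian q(x) with the same mean and variance, i.e. the KL
    # projection discussed above. The numbers below are illustrative only.
    import numpy as np

    alpha = np.array([0.7, 0.2, 0.1])   # mixture weights, sum to 1
    mu    = np.array([0.0, 3.0, -5.0])  # component means
    s2    = np.array([1.0, 4.0, 25.0])  # component variances

    mean_q = np.sum(alpha * mu)                         # E_p[x]
    var_q  = np.sum(alpha * (s2 + mu**2)) - mean_q**2   # E_p[x^2] - E_p[x]^2

    print(f"collapsed Gaussian: mean = {mean_q:.3f}, variance = {var_q:.3f}")
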