2018 52nd Asilomar Conference on Signals, Systems, and Computers
DOI: 10.1109/acssc.2018.8645180
Analysis of Cascaded Signal Processing Operations Using Entropy Rate Power

Cited by 1 publication (3 citation statements)
References 14 publications
“…An interesting quantity introduced and analyzed by Gibson in a series of papers is the log ratio of entropy powers [5,8,9]. Specifically, the log ratio of entropy powers is related to the difference in mutual information, and further, in many cases, the entropy powers can be replaced with the minimum mean squared prediction error (MMSPE) in the ratio.…”

Section: A Mutual Information Decomposition
confidence: 99%
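For context, the statement above relies on the standard definitions (not spelled out on this page): the entropy power of a random variable X with differential entropy h(X), and the identity that makes the log ratio of entropy powers equal to a difference of differential entropies:

```latex
N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
\frac{1}{2}\ln\frac{N(X)}{N(Y)} \;=\; h(X) - h(Y).
```

For a Gaussian source the entropy power equals the variance, which is why the MMSPE can stand in for the entropy power in the ratio in that case.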
“…Gibson [5] used Equation (17) to investigate the change in mutual information as the predictor order, denoted in the following by N, is increased for different speech frames. Based on several analyses of the MMSPE and the fact that the log ratio of entropy powers can be replaced with the log ratio of MMSPEs for several different distributions, as outlined in Section 6 and in [9], we can use the expression…”

Section: Mutual Information in the Short-term Prediction Component
confidence: 99%
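The step described above — tracking how mutual information grows with predictor order N via MMSPE ratios — can be sketched numerically. This is a minimal illustration, not the paper's code: it assumes a hypothetical unit-variance AR(1) source with coefficient 0.9, computes the MMSPE at each order with the Levinson-Durbin recursion, and takes the per-order information gain as half the log ratio of successive MMSPEs.

```python
import numpy as np

def levinson_durbin(r, order):
    """Levinson-Durbin recursion.

    r     : autocorrelation sequence r[0], ..., r[order]
    order : maximum predictor order N
    returns the MMSPE (minimum mean squared prediction error) for orders 0..order
    """
    a = np.zeros(order + 1)   # predictor coefficients a_1 .. a_m
    err = [r[0]]              # order-0 MMSPE is the signal variance
    for m in range(1, order + 1):
        # reflection coefficient for order m
        k = (r[m] - np.dot(a[1:m], r[m-1:0:-1])) / err[-1]
        a_new = a.copy()
        a_new[m] = k
        a_new[1:m] = a[1:m] - k * a[1:m][::-1]
        a = a_new
        err.append(err[-1] * (1.0 - k * k))
    return np.array(err)

# Hypothetical example: unit-variance AR(1) source with coefficient 0.9,
# whose autocorrelation is r[k] = 0.9**k.
r = 0.9 ** np.arange(5)
mmspe = levinson_durbin(r, 4)

# Per-order mutual information gain (in nats), using MMSPE in place of
# the entropy power in the log ratio:
delta_I = 0.5 * np.log(mmspe[:-1] / mmspe[1:])
```

For an AR(1) source the whole gain is captured at order 1 and `delta_I` is zero beyond it, which matches the intuition that increasing N past the true model order adds no mutual information.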