Handbook of Differential Entropy
2013
DOI: 10.1201/b15991

Cited by 101 publications (104 citation statements)
References 0 publications
“…There is no closed-form solution for Rician entropy [6], and when added to a Gaussian, the measurement distribution becomes even more complex. Using our method, we first modeled a zero-radius Rician, which is equivalent to a Rayleigh.…”
Section: A Radar Range (mentioning, confidence: 99%)
“…However, even for simple scenarios such as independent additive white Gaussian noise (AWGN), the resulting distributions can be complicated. Except for Gaussian source distributions, there are few closed-form entropy expressions for sums of independent random variables [6]. In addition, simulation or histogram methods can suffer from large biases [7].…”
Section: Introduction (mentioning, confidence: 99%)
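The Gaussian case the excerpt singles out is the rare one with a clean closed form: for X ~ N(μ, σ²), h(X) = ½ ln(2πeσ²), and because sums of independent Gaussians remainAussian with added variances, the entropy of a signal plus AWGN follows immediately. A minimal sketch (variable names and the example variances are illustrative, not from the source):

```python
import math

def gaussian_entropy(sigma):
    """Closed-form differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2) nats."""
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma * sigma)

# Sums of independent Gaussians stay Gaussian, so variances simply add:
# X ~ N(0, 1) plus AWGN ~ N(0, 4) gives X + AWGN ~ N(0, 5).
h_sum = gaussian_entropy(math.sqrt(1.0 + 4.0))
```

For non-Gaussian sources no such shortcut exists, which is why the citing papers fall back on numerical or Monte Carlo estimates.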
“…The range, in general, is Rice distributed [10]. Unfortunately, there is no closed-form solution for the differential entropy of a Rician [11]. However, we can compute this entropy using numerical integration or Monte Carlo methods [12].…”
Section: Measurement Model / Measurement Entropy (mentioning, confidence: 99%)
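The Monte Carlo route mentioned in the excerpt can be sketched as follows: draw samples from the Rice distribution and average −log f(X), since h(X) = −E[log f(X)]. This is a minimal illustration, not the cited papers' implementation; the function names, series-based Bessel evaluation, and sample count are assumptions made here.

```python
import math
import random

def i0(z, terms=40):
    """Modified Bessel function I_0(z) via its power series (adequate for moderate z)."""
    s, t = 1.0, 1.0
    for k in range(1, terms):
        t *= (z * z / 4.0) / (k * k)
        s += t
    return s

def rice_logpdf(x, nu, sigma):
    """log of the Rice pdf f(x) = (x/sigma^2) exp(-(x^2+nu^2)/(2 sigma^2)) I0(x nu / sigma^2)."""
    s2 = sigma * sigma
    return math.log(x / s2) - (x * x + nu * nu) / (2.0 * s2) + math.log(i0(x * nu / s2))

def rice_sample(nu, sigma, rng):
    """A Rice variate is the magnitude of (nu + sigma*G1, sigma*G2), G1, G2 standard normal."""
    return math.hypot(nu + sigma * rng.gauss(0.0, 1.0), sigma * rng.gauss(0.0, 1.0))

def mc_entropy(nu, sigma, n=200_000, seed=1):
    """Monte Carlo estimate of h(X) = -E[log f(X)] for a Rice(nu, sigma) variable."""
    rng = random.Random(seed)
    return -sum(rice_logpdf(rice_sample(nu, sigma, rng), nu, sigma)
                for _ in range(n)) / n
```

As a sanity check tying back to the first excerpt: at ν = 0 the Rician reduces to a Rayleigh, whose differential entropy does have a closed form, h = 1 + ln(σ/√2) + γ/2 with γ the Euler–Mascheroni constant, so the Monte Carlo estimate can be validated against it.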
“…The application to the prediction of wind speed indicates that the variances of resampled time series increase almost exponentially with increasing resampling period. Entropy in information theory is a powerful concept for describing the uncertainty of a non-stationary time series [1,2], and it has been used in a wide range of fields [3][4][5][6][7][8][9][10][11][12]. Its estimators from time series have also been studied extensively [13].…”
(mentioning, confidence: 99%)