2012
DOI: 10.1016/j.jspi.2012.02.023
On the convergence of Shannon differential entropy, and its connections with density and entropy estimation

Cited by 9 publications (4 citation statements); references 20 publications.
“…Note that μ̂θ converges pointwise to µ* = (1, 0, …) as θ vanishes. However, by the entropy discontinuity [17], [18], [23], the convergence of the measure to µ* is not sufficient to guarantee that lim θ→0 H(μ̂θ) = H(µ*) = 0.…”
Section: Proof
confidence: 99%
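The discontinuity the excerpt invokes can be illustrated with a standard counterexample (a sketch for intuition, not the construction from the cited paper): let pₙ put mass 1 − 1/n on atom 0 and spread the remaining 1/n uniformly over 2^(n²) further atoms. Then pₙ converges to the point mass (1, 0, …), whose entropy is 0, yet H(pₙ) = h(1/n) + n grows without bound (h is the binary entropy, logs base 2).

```python
import math

def entropy_bits(eps, n):
    """Entropy (in bits) of the distribution that puts mass 1 - eps on
    atom 0 and spreads eps uniformly over 2**(n*n) further atoms.
    Closed form: H = h(eps) + eps * n**2, with h the binary entropy."""
    h_eps = -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
    return h_eps + eps * n * n

# Mass on atom 0 tends to 1, yet the entropy diverges.
for n in (2, 10, 100):
    eps = 1.0 / n
    print(f"n={n}: mass on atom 0 = {1 - eps:.3f}, H = {entropy_bits(eps, n):.3f} bits")
```

Each pₙ is within total-variation distance 1/n of the limiting point mass, so convergence of the measures holds in a strong sense, while lim H(pₙ) ≠ H(limit) = 0.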
“…It compares the entropy of a parent subspace with those of its child subspaces to determine the optimal level of resolution using the optimal mother wavelet. The criterion states that if the entropy of a signal at a new level is higher than that of the previous level, further decomposition of the signal is not needed [42].…”
Section: Sending End
confidence: 99%
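A minimal sketch of this stopping rule, using a single Haar decomposition step and the Shannon entropy of the normalized coefficient energies (the function names, the Haar choice, and the use of NumPy are illustrative assumptions, not the exact method of [42]):

```python
import numpy as np

def shannon_entropy(coeffs):
    """Shannon entropy (bits) of the normalized energy distribution."""
    p = coeffs**2 / np.sum(coeffs**2)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def haar_step(x):
    """One Haar level: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def should_decompose(x):
    """Entropy criterion: decompose only if the children's combined
    entropy is lower than the parent's; otherwise stop."""
    a, d = haar_step(x)
    return shannon_entropy(np.concatenate([a, d])) < shannon_entropy(x)
```

For a constant signal of length 8, the Haar step concentrates all energy in the 4 approximation coefficients, so the children's entropy (2 bits) is below the parent's (3 bits) and decomposition continues; for signals whose entropy does not drop, the rule halts.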
“…On this basis, the use of information entropy is widespread in engineering applications. Different types of information entropy have been defined in accordance with their own usage, such as topological entropy of a given interval map [14], spatial entropy of pixels [15], weighted multiscale permutation entropy of nonlinear time series [16], Shannon differential entropy for distributions [17], min- and max-entropies [18], collision entropy [19], permutation entropy [20], time entropy [21], multiscale entropy [22], wavelet entropy [23] and so on.…”
Section: Introduction
confidence: 99%