2016
DOI: 10.1109/tit.2016.2604248
Entropy and Source Coding for Integer-Dimensional Singular Random Variables

Abstract: Entropy and differential entropy are important quantities in information theory. A tractable extension to singular random variables, which are neither discrete nor continuous, has not been available so far. Here, we present such an extension for the practically relevant class of integer-dimensional singular random variables. The proposed entropy definition contains the entropy of discrete random variables and the differential entropy of continuous random variables as special cases. We show that it transforms in a…
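The two classical special cases mentioned in the abstract can be illustrated with a minimal sketch. The function names below are illustrative, not from the paper: Shannon entropy is the 0-dimensional (discrete) case and differential entropy the full-dimensional (continuous) case of the proposed definition.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum p log p, in nats (discrete special case)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gaussian_differential_entropy(sigma2):
    """Differential entropy h(X) = 0.5 * log(2*pi*e*sigma2) of a Gaussian
    N(0, sigma2), in nats (continuous special case)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

print(shannon_entropy([0.5, 0.5]))          # log 2, about 0.6931 nats
print(gaussian_differential_entropy(1.0))   # about 1.4189 nats
```

A singular random variable of intermediate integer dimension (e.g., supported on a curve in the plane) falls into neither of these two cases, which is the gap the paper's definition fills.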

Cited by 16 publications (16 citation statements)
References 27 publications (99 reference statements)
“…A slightly weaker version of this statement, valid for Lipschitz mappings, was derived previously in [23, Lemma 4].…”
Section: Rectifiable Sets and Rectifiable Random Vectors (mentioning)
Confidence: 92%
“…We particularize our achievability result to s-rectifiable random vectors x as introduced in [23]; these are random vectors supported on countable unions of s-dimensional C^1-submanifolds of R^m, with distribution absolutely continuous with respect to s-dimensional Hausdorff measure. Countable unions of C^1-submanifolds include numerous signal models prevalent in the compressed sensing literature, namely, the standard union-of-subspaces model underlying much of compressed sensing theory [24], [25] and spectrum-blind sampling [26], [27], smooth submanifolds [28], block-sparsity [29]–[31], and low-rank matrices as considered in the matrix completion problem [32]–[34].…”
Citation type: mentioning
Confidence: 99%
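The union-of-subspaces model named in the statement above has a concrete instance worth sketching: k-sparse vectors in R^m lie on the union of the C(m, k) coordinate subspaces of dimension k, one example of an s-rectifiable support. The helper names below are illustrative, not from the cited works.

```python
import itertools
import numpy as np

def sparse_supports(m, k):
    """Supports of all k-dimensional coordinate subspaces of R^m."""
    return list(itertools.combinations(range(m), k))

def random_k_sparse(m, k, rng=np.random.default_rng(0)):
    """Draw a k-sparse vector: pick one coordinate subspace, fill it
    with Gaussian entries, and leave the other coordinates at zero."""
    supports = sparse_supports(m, k)
    s = supports[rng.integers(len(supports))]
    x = np.zeros(m)
    x[list(s)] = rng.standard_normal(k)
    return x

print(len(sparse_supports(6, 2)))              # C(6, 2) = 15 subspaces
print(np.count_nonzero(random_k_sparse(6, 2)))  # at most 2 nonzero entries
```

Each coordinate subspace is a (trivially smooth) k-dimensional submanifold, so the union is a countable union of C^1-submanifolds of the kind the quoted achievability result covers.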
“…This was introduced by Csiszár in [5]; see also Eq. (8) in [7]. Note that the set where f = 0, hence log(f) = −∞, is ρ-negligible.…”
Section: Definition and AEP (mentioning)
Confidence: 99%
“…where (40) is by [49, Equation (3.6)] and in (42) we used (33). We can hence conclude that γ_{p,d} is ρ-subregular of dimension d − p. The corresponding subregularity constants are…”
Citation type: mentioning
Confidence: 96%
“…For continuous X of finite differential entropy under the difference distortion function ρ(x, y) = ‖x − y‖^k, where ‖·‖ is a semi-norm and k ∈ (0, ∞), the Shannon lower bound is known explicitly [63, Section VI] and, provided that X additionally satisfies a certain moment constraint, is tight as D → 0 [43], [39]. For the class of m-rectifiable random variables [40, Definition 11], a Shannon lower bound was reported recently in [40, Theorem 55]. This bound is, however, not in explicit form and depends on a parametrized integral.…”
Citation type: mentioning
Confidence: 99%
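For context, the classical Shannon lower bound referenced in the statement above can be sketched in its general form for a difference distortion; this is the standard textbook form, not the parametrized rectifiable variant of [40, Theorem 55]:

```latex
% Classical Shannon lower bound (sketch): for a difference distortion
% \rho(x, y) = d(x - y), the rate-distortion function satisfies
R(D) \;\ge\; h(X) \;-\; \max_{p_Z :\, \mathbb{E}[d(Z)] \le D} h(Z),
% i.e., h(X) minus the maximum differential entropy of a "noise"
% variable Z whose expected distortion does not exceed D.
```

For ρ(x, y) = ‖x − y‖^k the inner maximization admits a closed form, which is what makes the bound explicit in the continuous case discussed above.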