2019
DOI: 10.1109/tit.2019.2923091

Lossless Analog Compression

Abstract: We establish the fundamental limits of lossless analog compression by considering the recovery of arbitrary random vectors x ∈ R^m from the noiseless linear measurements y = Ax with measurement matrix A ∈ R^{n×m}. Our theory is inspired by the groundbreaking work of Wu and Verdú (2010) on almost lossless analog compression, but applies to the nonasymptotic, i.e., fixed-m, case and considers zero error probability. Specifically, our achievability result states that, for Lebesgue-almost all A, the random vector x…
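As a minimal sketch of the measurement model in the abstract (not the paper's construction), the recovery of an s-sparse x from n = 2s noiseless measurements y = Ax can be simulated for a randomly drawn, hence "Lebesgue-almost any", matrix A. The decoder below is a brute-force search over supports; all dimensions and names are hypothetical choices for illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, n, s = 6, 4, 2                  # ambient dimension, measurements, sparsity (n = 2s)

A = rng.standard_normal((n, m))    # a generic ("Lebesgue-almost any") measurement matrix
x = np.zeros(m)
x[[1, 4]] = [2.0, -1.5]            # an s-sparse signal
y = A @ x                          # noiseless linear measurements, n < m

# Zero-error decoder (toy): exhaustive least-squares over all supports of size s.
# For generic A, only the true support reproduces y exactly.
x_hat = None
for S in itertools.combinations(range(m), s):
    coef, *_ = np.linalg.lstsq(A[:, list(S)], y, rcond=None)
    if np.linalg.norm(A[:, list(S)] @ coef - y) < 1e-9:
        x_hat = np.zeros(m)
        x_hat[list(S)] = coef
        break
```

For generic A with n = 2s, any 2s columns are linearly independent, so the s-sparse solution is unique and the search recovers x exactly; this brute-force decoder is exponential in m and serves only to illustrate the zero-error, fixed-m setting.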

Cited by 13 publications (11 citation statements)
References 53 publications
“…However, it does not provide a Hölder inverse and argues that such an inverse cannot exist in general in the probabilistic context. Similar results for the modified lower box-counting dimension have also been obtained in [44].…”
Section: This Follows From the Observation That Condition (supporting)
confidence: 84%
“…Applying Jensen's inequality (see, e.g., [31, Theorem 3.3]) yields (44). As k_0 ≤ n_0, inequality (45) follows.…”
Section: Then For Any (mentioning)
confidence: 99%
“…3.4]), we can assume, w.l.o.g., that U is compact. For the detailed arguments leading to this statement, we refer to [22].…”
Section: Proof Of Theorem (mentioning)
confidence: 99%
“…This paper is concerned with the lossy compression of general random variables, specifically with rate-distortion (R-D) theory and quantization of random variables taking values in general measurable spaces such as, e.g., manifolds and fractal sets. Manifold structures are prevalent in data science, e.g., in compressed sensing [4,9,8,1,54,11], machine learning [21,45], image processing [44,57], directional statistics [47], and handwritten digit recognition [31]. Fractal sets find application in image compression and in modeling of Ethernet traffic [41].…”
mentioning
confidence: 99%
“…In R-D theory [56, 6, 28, 29, 15], one is interested in the characterization of the ultimate limits on the compression of sequences of random variables under a distortion constraint, here expressed in terms of expected average distortion. Specifically, let (A, 𝒜) and (B, ℬ) be measurable spaces equipped with a measurable function σ : A × B → [0, ∞], henceforth referred to as a distortion function, and let (X_i)_{i∈N} be a sequence of random variables taking values in (A, 𝒜). For every ℓ ∈ N, measurable mappings g^{(ℓ)} : A^ℓ → B^ℓ with |g^{(ℓ)}(A^ℓ)| < ∞ are referred to as source codes of length ℓ.…”
mentioning
confidence: 99%
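The source-code definition in the excerpt above can be made concrete with a toy sketch. Assumptions (not from the cited work): squared-error distortion σ(a, b) = (a − b)², a uniform source on [0, 1), and an elementwise uniform scalar quantizer playing the role of a hypothetical g^{(ℓ)} with finite range.

```python
import numpy as np

rng = np.random.default_rng(1)
ell, levels = 8, 4                 # block length ℓ and codebook size per symbol

# A hypothetical length-ℓ source code g : [0, 1)^ℓ -> B^ℓ with |g([0, 1)^ℓ)| = levels**ℓ < ∞,
# realized as an elementwise uniform scalar quantizer.
def g(block):
    idx = np.floor(block * levels).astype(int)   # encoder: index of quantization cell
    return (idx + 0.5) / levels                  # decoder: midpoint of that cell

rate = np.log2(levels)             # bits per source symbol: log2(levels**ℓ) / ℓ

# Monte Carlo estimate of the expected average distortion
# E[(1/ℓ) Σ_i σ(X_i, g(X)_i)] with σ(a, b) = (a - b)^2.
X = rng.random((10000, ell))
D = np.mean((X - g(X)) ** 2)
```

The range of g has levels^ℓ elements, so this code operates at log₂(levels) bits per source symbol; for this particular quantizer the expected average distortion works out to 1/(12·levels²).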