2013
DOI: 10.1117/1.jrs.7.074597

Information-theoretic assessment of on-board near-lossless compression of hyperspectral data

Abstract: A rate-distortion model is proposed to measure the impact of near-lossless compression of raw data, that is, compression with a user-defined maximum absolute error, on the information available once the compressed data have been received and decompressed. Such a model requires the original uncompressed raw data and their measured noise variances. Advanced near-lossless methods are exploited only to measure the entropy of the datasets but are not required for on-board compression. In substance, the acquir…
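The abstract is truncated above, but the setting it describes, compression with a user-defined maximum absolute error whose effect on the usable information is assessed from the data and their measured noise variances, can be illustrated with a rough sketch. The Gaussian-channel formula and the uniform-quantizer error variance below are textbook approximations assumed for the example; they are not the rate-distortion model proposed in the paper, and the synthetic band and noise figures are made up for illustration.

```python
import numpy as np

def useful_information_bits(raw, noise_var, delta=0):
    """Rough per-sample information estimate for raw data with measured
    noise variance `noise_var`, optionally degraded by near-lossless
    compression with peak absolute error `delta`.
    Textbook approximations only (Gaussian channel, uniform quantizer);
    this is NOT the rate-distortion model proposed in the paper."""
    signal_var = max(np.var(raw) - noise_var, 0.0)            # crude signal-variance estimate
    step = 2 * delta + 1                                      # quantizer step giving |error| <= delta
    quant_var = (step ** 2 - 1) / 12.0 if delta > 0 else 0.0  # uniform-quantizer error variance
    total_noise = noise_var + quant_var                       # compression error adds to sensor noise
    return 0.5 * np.log2(1.0 + signal_var / total_noise)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic "band": signal with std 40 plus sensor noise with std 4 (variance 16)
    band = rng.normal(0, 40, size=100_000) + rng.normal(0, 4, size=100_000)
    print("lossless  :", useful_information_bits(band, noise_var=16.0), "bit/sample")
    print("delta = 2 :", useful_information_bits(band, noise_var=16.0, delta=2), "bit/sample")
```

Running the example shows how the information estimate drops slightly once the bounded quantization error is added on top of the measured sensor noise, which is the kind of trade-off the paper's model is meant to quantify.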

Cited by 1 publication (1 citation statement). References 31 publications.
“…In the case of DPCM-like strategies, lossy compression with a peak error controlled by the user can also be achieved by inserting a quantization step in the prediction loop. This is the so-called near-lossless compression, which can be obtained by means of both causal and non-causal schemes [22,23]. Concerning the encoding stage, arithmetic coding can reach the entropy limit [24], but it is not exploitable for an on-board implementation.…”
Section: On-board Data Compression
Confidence: 99%
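The quoted statement describes the core of near-lossless DPCM: a quantizer placed inside the prediction loop keeps the reconstruction error of every sample within a user-defined bound. The minimal sketch below illustrates that mechanism under simplifying assumptions; the previous-sample predictor, the quantizer step of 2*delta+1, and the first-order entropy estimate are illustrative choices, not the specific schemes of references [22-24].

```python
import numpy as np

def dpcm_near_lossless(samples, delta):
    """Near-lossless DPCM sketch: the quantizer sits inside the prediction
    loop, so the reconstruction error of every sample is bounded by the
    user-defined peak error `delta`. Illustrative only; the predictor and
    the entropy coder are deliberately simplified."""
    step = 2 * delta + 1            # uniform quantizer step ensuring |error| <= delta
    recon_prev = 0                  # previous-sample predictor (simplified)
    indices, recon = [], []
    for x in samples:
        pred = recon_prev                    # predict from the *reconstructed* past sample
        residual = int(x) - pred
        q = int(np.round(residual / step))   # quantized residual index (to be entropy coded)
        r = pred + q * step                  # decoder-side reconstruction
        indices.append(q)
        recon.append(r)
        recon_prev = r
    return np.array(indices), np.array(recon)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    line = np.cumsum(rng.integers(-5, 6, size=1000)) + 1000   # synthetic smooth scan line
    delta = 2
    q_idx, recon = dpcm_near_lossless(line, delta)
    assert np.max(np.abs(line - recon)) <= delta              # peak-error guarantee holds
    # First-order entropy of the residual indices, a rough proxy for the achievable rate
    _, counts = np.unique(q_idx, return_counts=True)
    p = counts / counts.sum()
    print("max |error| =", np.max(np.abs(line - recon)),
          "| residual entropy (bit/sample): %.2f" % -(p * np.log2(p)).sum())
```

Because the predictor works on reconstructed samples, quantization errors do not accumulate along the scan, which is what makes the peak-error guarantee hold sample by sample.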