2014 Data Compression Conference
DOI: 10.1109/dcc.2014.65
Entropy Reduction Using Context Transformations

Abstract: There are two classes of algorithms used in data compression. The first class deals directly with compression itself and includes algorithms such as Huffman coding, the Lempel-Ziv family, PPM, and others. The second class contains algorithms that transform the data into a form that is more easily compressed by the first class, for example the Burrows-Wheeler transform or Move-to-Front coding. We prepared a second-class method that transforms input data into data with lower entropy. Consider an input m…

Cited by 3 publications (4 citation statements); references 0 publications.
“…Our recent studies were focused on a special class of grammar transforms that leave the message size intact [19,20]. In the present paper, the class of grammar transformations is extended with a novel concept of higher order context transformation [21].…”
Section: Previous Work
confidence: 99%
“…The concept of context transformations was first proposed in [25], and the results were presented in [19]. It is the simplest transformation: it assumes a pair of digrams beginning with the same symbol, one of which is initially missing in the input message.…”
Section: Context Transformation
confidence: 99%
“…In our work we rely on this fact and we assume that Shannon's entropy of the transformed data is approximately achievable. In our previous work on context transformations [2] and generalized context transformations [3] we studied under what conditions the exchange of two different digrams beginning with the same symbol leads to a reduction of Shannon's entropy.…”
Section: Introduction
confidence: 99%
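The digram exchange described in these citation statements can be illustrated with a small sketch. This is a hypothetical reconstruction from the statements alone, not the authors' implementation: it replaces one digram with another that shares the same first symbol and is absent from the input (which keeps the mapping invertible), and measures zero-order Shannon entropy before and after.

```python
# Sketch of a context transformation on digrams (hypothetical
# reconstruction): replacing "ab" with the absent digram "ac" turns
# rare b's into already-frequent c's, lowering zero-order entropy.
from collections import Counter
from math import log2

def entropy(msg):
    """Zero-order Shannon entropy of msg, in bits per symbol."""
    n = len(msg)
    return -sum(c / n * log2(c / n) for c in Counter(msg).values())

def transform(msg, d1, d2):
    """Replace every occurrence of digram d1 with d2 (left-to-right scan).

    Invertible only when d2 does not occur in msg: the inverse is
    transform(result, d2, d1).
    """
    out, i = [], 0
    while i < len(msg):
        if msg[i:i + 2] == d1:
            out.append(d2)
            i += 2
        else:
            out.append(msg[i])
            i += 1
    return "".join(out)

msg = "cabcabccc"              # "ac" never occurs, so "ab" -> "ac" is safe
t = transform(msg, "ab", "ac") # -> "caccacccc"
assert entropy(t) < entropy(msg)
assert transform(t, "ac", "ab") == msg   # the transform is invertible
```

Because the two digrams begin with the same symbol, only the distribution of second symbols changes; entropy drops whenever the replacement concentrates probability mass on a symbol that is already frequent.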
“…GCT is an improved version of the Context Transformation (CT) method presented recently [1]. GCT is used as a preprocessor for zero-order entropy coding algorithms such as arithmetic or Huffman coding.…”
confidence: 99%