Proceedings DCC 2000. Data Compression Conference
DOI: 10.1109/dcc.2000.838152
Implementing the context tree weighting method for text compression

Cited by 26 publications (21 citation statements)
References 12 publications
“…It makes use of Move-To-Front (MTF) [1] and an entropy coder as the back-end compressor. Efforts are ongoing to improve the efficiency of PPM [8,9,17] and BWT [1,3,19].…”
Section: Introduction
confidence: 99%
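The statement above mentions the Move-To-Front transform used alongside an entropy coder in BWT-based compressors. As a rough illustration only (not the cited implementation), MTF over a byte alphabet keeps a list of symbols and emits each symbol's current position, moving it to the front afterwards, so runs of repeated symbols become runs of zeros:

```python
# Illustrative sketch of the Move-To-Front (MTF) transform over bytes.
# Recently seen symbols sit near the front of the list, so repeated
# symbols encode as small indices (often zeros), which entropy coders
# compress well.

def mtf_encode(data: bytes) -> list[int]:
    alphabet = list(range(256))
    out = []
    for b in data:
        i = alphabet.index(b)      # current position of the symbol
        out.append(i)
        alphabet.pop(i)            # move symbol to the front
        alphabet.insert(0, b)
    return out

def mtf_decode(indices: list[int]) -> bytes:
    alphabet = list(range(256))
    out = bytearray()
    for i in indices:
        b = alphabet.pop(i)        # index identifies the symbol
        out.append(b)
        alphabet.insert(0, b)
    return bytes(out)
```

For example, `mtf_encode(b"aaab")` yields `[97, 0, 0, 98]`: the first `a` (byte 97) costs its full index, each repeat costs 0.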
“…The compression results also show an improvement of roughly 2% to 7% for compression methods with LIPT over the best results obtained by various modifications of BWT [1,2,5,18] and PPM [6,7,17,19]. This comes at the expense of some storage overhead, whose amortized cost is shown to be negligible.…”
Section: Figure 1: Frequency Of Words Versus Length Of Words In Our T
confidence: 55%
“…A number of efforts have been made to reduce the running time of PPM and also to improve its compression ratio. Sadakane, Okazaki, and Imai [17] have given a method that combines PPM and CTW [19] to get better compression. Effros [7] has given a new implementation of PPM* with the complexity of BWT.…”
Section: Comparison With Recent Improvements Of BWT And PPM
confidence: 99%
“…Unfortunately, it does not fit the mixture model scheme of Section 1.2. The latter approach was introduced as an implementation technique [105,78] and received less attention. However, it resembles the mixture model scheme of Section 1.2.…”
Section: Context Tree Weighting
confidence: 99%
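The statements above refer to the CTW mixture model. As an illustrative sketch only (simplified binary version, not the paper's implementation), each tree node keeps a Krichevsky-Trofimov (KT) estimate P_e of its own subsequence and a weighted probability P_w = ½(P_e + P_w⁰ · P_w¹) that mixes the node's estimate with the product of its children's:

```python
# Simplified sketch of context tree weighting for a binary sequence.
# Each node holds KT counts and mixes its own estimate with its
# children's weighted probabilities: pw = 0.5 * (pe + prod(children)).

class Node:
    def __init__(self):
        self.a = 0        # zeros seen in this context
        self.b = 0        # ones seen in this context
        self.pe = 1.0     # KT estimate of this node's subsequence
        self.pw = 1.0     # weighted (mixture) probability
        self.children = {}  # context bit -> child Node

def update(node, bit, context, depth):
    # KT sequential predictor: P(next = 1) = (b + 1/2) / (a + b + 1)
    p = ((node.b if bit else node.a) + 0.5) / (node.a + node.b + 1.0)
    node.pe *= p
    if bit:
        node.b += 1
    else:
        node.a += 1
    if depth == 0 or not context:
        node.pw = node.pe           # leaf: no mixing
    else:
        c = context[0]              # most recent context bit
        child = node.children.setdefault(c, Node())
        update(child, bit, context[1:], depth - 1)
        prod = 1.0
        for ch in node.children.values():
            prod *= ch.pw           # unvisited children contribute 1
        node.pw = 0.5 * (node.pe + prod)

def ctw_probability(bits, depth=3):
    """Weighted probability CTW assigns to the whole bit sequence."""
    root = Node()
    for i, bit in enumerate(bits):
        ctx = bits[max(0, i - depth):i][::-1]  # most recent bit first
        update(root, bit, ctx, depth)
    return root.pw
```

A highly regular sequence (e.g. all zeros) receives a much larger weighted probability, and thus a shorter arithmetic-coded representation, than an irregular one of the same length.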