1997
DOI: 10.1093/comjnl/40.2_and_3.67
Unbounded Length Contexts for PPM

Cited by 211 publications (188 citation statements)
References 17 publications
“…We did not try PPM* [9], which applies unbounded contexts, although many researchers regard it as efficient. However, Bunton [5] reports that PPMC of order 5 is better than PPM* in text compression.…”
Section: Discussion
confidence: 99%
“…The language modelling involves multiple partial-predictive-match (PPM) models [1,2]. These code text very well, approaching the theoretical maximum compression for English texts (see [17]).…”
Section: PPM Language Model
confidence: 99%
“…A simple language model (based upon partial predictive matching, see [8,9]) is used to produce p(letter|prefix) (referred to as p(l|pr)) on a per-word basis. A tree with probability information is generated from a corpus (in this case texts from Project Gutenberg [10]).…”
Section: Probability Model
confidence: 99%