1976
DOI: 10.1109/tit.1976.1055498
First, second- and third-order entropies of Arabic text (Corresp.)

Cited by 14 publications (6 citation statements)
References 1 publication
“…Both joint entropy per character and conditional entropy show a downward trend as the order increases. The 1st–3rd order entropies are higher than those of English text [14], presumably because Thai text contains a larger number and greater variety of characters.…”
Section: Criteria for the Performance Comparison
confidence: 93%
“…After that, many other papers used Shannon's method to calculate the entropy of English on different passages and context lengths [19][20][21][22]. Other papers used Shannon's technique to measure the entropy of other languages [23][24][25][26][27]. Shannon's method was modified by Cover and King [28], who asked their subjects to gamble on the next character.…”
Section: Related Work
confidence: 99%
“…Kontoyiannis reported a value of 1.77 bits per character using a method based on string matching, resembling universal coding, applied to a one million character sample from a single author [5]. For Arabic, Wanas et al. calculated the first, second, and third order entropies of written Arabic text [6]. For Telugu, P. Balasubrahmanyam et al. constructed an optimum code for the Telugu alphabet, and estimates of one-gram entropies of the different forms of prose writing were also obtained [7].…”
Section: Introduction
confidence: 99%