Published in: Complexity, Isolation, and Variation (2016)
DOI: 10.1515/9783110348965-004

An information-theoretic approach to assess linguistic complexity


Cited by 64 publications (56 citation statements: 1 supporting, 55 mentioning, 0 contrasting).
References: 0 publications.

“…Displacing the distribution to the left or skewing it towards low values would compromise expressivity. In contrast, skewing it towards the right increases the potential for expressivity according to Equation (13), though this comes with a learnability cost.…”
Section: Entropy Diversity Across Languages of the World (mentioning)
Confidence: 99%
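
The passage above concerns the distribution of entropy values across languages: shifting that distribution toward higher entropies raises expressive potential at a learnability cost. Its Equation (13) is not reproduced on this page, but the underlying link rests on a generic property of Shannon entropy: for a vocabulary of V word types, unigram entropy is bounded by log2(V), and skewed (Zipf-like) frequency distributions sit well below that ceiling. The Python sketch below illustrates only that generic effect; the vocabulary size and skew exponent are invented for illustration and are not taken from the chapter.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a unigram frequency distribution."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
vocab_size = 1000  # hypothetical lexicon size, chosen for illustration

# A strongly skewed (Zipf-like) distribution concentrates probability
# mass on a few word types -> entropy far below the log2(V) ceiling.
zipf_counts = 1.0 / np.arange(1, vocab_size + 1) ** 1.5

# A near-uniform distribution spreads mass out -> entropy close to the
# ceiling, i.e. more expressive potential per word, though (per the
# quoted argument) at a learnability cost.
flat_counts = rng.uniform(0.5, 1.5, size=vocab_size)

print(f"upper bound log2(V)  = {np.log2(vocab_size):.2f} bits")
print(f"Zipf-like entropy    = {shannon_entropy(zipf_counts):.2f} bits")
print(f"near-uniform entropy = {shannon_entropy(flat_counts):.2f} bits")
```

The Zipf-like lexicon comes out several bits below the near-uniform one, which is the sense in which moving probability mass (or, across languages, moving languages) toward higher-entropy configurations increases the potential for expressivity.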
“…For example, several recent studies engage in establishing information-theoretic and corpus-based methods for linguistic typology, i.e., classifying and comparing languages according to their information encoding potential [10,12–16], and how this potential evolves over time [17–19]. Similar methods have been applied to compare and distinguish non-linguistic sequences from written language [20,21], though it is controversial whether this helps with more fine-grained distinctions between symbolic systems and written language [22,23].…”
Section: Introduction (mentioning)
Confidence: 99%
“…In the context of quantitative linguistics, entropic measures are used to understand laws in natural languages, such as the relationship between word frequency, predictability and the length of words [24–27], or the trade-off between word structure and sentence structure [10,13,28]. Information theory can further help to understand the complexities involved when building words from the smallest meaningful units, i.e.…”
Section: Introduction (mentioning)
Confidence: 99%
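
To make the measures in this passage concrete, here is a minimal, self-contained Python sketch of corpus-based unigram word entropy, together with a toy check of the frequency-length relationship (Zipf's law of abbreviation) that the cited laws describe. The corpus and the resulting numbers are invented for illustration; the cited studies work with large, often parallel, corpora so that values are comparable across languages.

```python
from collections import Counter
import math

def word_entropy(tokens):
    """Unigram word entropy in bits: the simplest corpus-based
    information-theoretic measure of the kind discussed above."""
    counts = Counter(tokens)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Toy corpus, invented for illustration only.
text = ("the cat sat on the mat and the dog sat on the log "
        "a cat and a dog met on the mat").split()

print(f"unigram entropy: {word_entropy(text):.2f} bits "
      f"over {len(set(text))} word types")

# Frequency/length relationship: if frequent words tend to be short,
# the token-weighted mean word length falls below the plain mean
# length over word types.
counts = Counter(text)
token_weighted = sum(len(w) * c for w, c in counts.items()) / len(text)
type_mean = sum(len(w) for w in counts) / len(counts)
print(f"token-weighted mean length: {token_weighted:.2f}; "
      f"type mean length: {type_mean:.2f}")
```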