2017
DOI: 10.1080/09296174.2017.1366095

Optimization Models of Natural Communication

Abstract: A family of information theoretic models of communication was introduced more than a decade ago to explain the origins of Zipf's law for word frequencies. The family is based on a combination of two information theoretic principles: maximization of mutual information between forms and meanings and minimization of form entropy. The family also sheds light on the origins of three other patterns: the principle of contrast, a related vocabulary learning bias and the meaning-frequency law. Here two important comp…
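The two principles named in the abstract are commonly combined into a single cost, with a parameter λ trading off mutual information maximization against form-entropy minimization, Ω(λ) = −λ I(S, R) + (1 − λ) H(S), which is minimized. A minimal sketch under the simplifying assumption that joint form-meaning probabilities are proportional to the entries of a binary form-meaning matrix (one of several conventions in this model family; `omega` is an illustrative name):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def omega(A, lam):
    """Cost combining the two principles:
    Omega(lam) = -lam * I(S; R) + (1 - lam) * H(S).

    A: binary form-meaning matrix (forms x meanings). The joint
    distribution p(s, r) is taken proportional to A -- a simplifying
    assumption, not the only choice in the literature.
    """
    P = A / A.sum()
    ps = P.sum(axis=1)   # marginal over forms S
    pr = P.sum(axis=0)   # marginal over meanings R
    H_S = entropy(ps)
    # I(S; R) = H(S) + H(R) - H(S, R)
    I = H_S + entropy(pr) - entropy(P.ravel())
    return -lam * I + (1 - lam) * H_S
```

A one-to-one mapping (identity matrix) maximizes I(S; R) at the price of maximal form entropy; a single form covering all meanings minimizes H(S) at the price of zero mutual information, so the minimizer of Ω depends on λ.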


Cited by 33 publications (37 citation statements)
References 82 publications
“…The reason for (1) is that our theory (e.g., Q) assumes a tree structure [19,28] and that we wanted to avoid the statistical problem of mixing trees with other kinds of graphs, e.g., the potential number of crossings depends on the number of edges [19,27,63]. The reason for (2) is that crossings are impossible in a star tree [63]. Condition (2) implies that the syntactic dependency structure has at least four vertices (otherwise all the possible trees are star trees).…”
Section: Methods (mentioning)
confidence: 99%
“…Here we provide a quick overview of a crossing theory developed in a series of articles [18,19,27,28,63]. It is correct to state that C cannot exceed the number of pairs of different edges, namely…”
Section: Crossing Theory (mentioning)
confidence: 99%
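The bound quoted above can be checked directly: the number of crossings C is counted over pairs of different edges in a linear arrangement, so C can never exceed the number of such pairs, and in a star tree every pair of edges shares the hub, giving C = 0. A minimal sketch, assuming edges are given as vertex pairs and the arrangement as an ordered vertex sequence (function names are illustrative):

```python
from itertools import combinations

def count_crossings(edges, order):
    """Count edge crossings C in a linear arrangement of a graph.

    edges: list of (u, v) vertex pairs; order: sequence of vertices
    giving the linear arrangement. Two edges cross iff their endpoint
    positions strictly interleave; edges sharing a vertex never cross.
    """
    pos = {v: i for i, v in enumerate(order)}
    spans = [tuple(sorted((pos[u], pos[v]))) for u, v in edges]
    c = 0
    for (a, b), (x, y) in combinations(spans, 2):
        if a < x < b < y or x < a < y < b:
            c += 1
    return c

def max_edge_pairs(n_edges):
    """Upper bound on C: the number of pairs of different edges."""
    return n_edges * (n_edges - 1) // 2
```

For a tree with n vertices there are n − 1 edges, so the bound is (n − 1)(n − 2)/2; a star tree yields C = 0 under any arrangement, which is why condition (2) in the quoted passage excludes star trees (and hence requires at least four vertices for a non-star tree to exist).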
“…Zipf's law is an example of a power-law model for the relationship between two variables [4]. Zipf's law for word frequencies can be explained by information theoretic models of communication [5]; it is a robust pattern of language that is invariant with text length in sufficiently long texts [6] and shows little sensitivity to the linguistic units considered [7]. The focus of this paper is to test the robustness of two statistical laws in linguistics that have been studied less intensively:…”
Section: Introduction (mentioning)
confidence: 99%
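The power-law form referred to in the quoted passage can be written f(r) ∝ r^(−α), where r is the rank of a word and α ≈ 1 for word frequencies; on log-log axes the rank-frequency curve is then a straight line of slope −α. A minimal sketch of the expected relative frequencies (function name is illustrative):

```python
import numpy as np

def zipf_freqs(n_ranks, alpha=1.0):
    """Expected relative frequencies under Zipf's law: f(r) ~ r**(-alpha).

    n_ranks: vocabulary size; alpha: the power-law exponent.
    Returns a normalized frequency vector indexed by rank - 1.
    """
    r = np.arange(1, n_ranks + 1)
    f = r ** (-float(alpha))
    return f / f.sum()
```

With α = 1 the most frequent word is expected to be twice as frequent as the second, three times as frequent as the third, and so on.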
“…Zipf's law for word frequencies, for instance, emerges in a critical balance between these two forces [39]. In these models, entropy minimization is linked with learnability: fewer word forms are easier to learn (see [80] for other cognitive costs associated with entropy). Mutual information maximization, in turn, is linked with expressivity via the form/meaning mappings available in a communication system.…”
Section: Entropy Diversity Across Languages of the World (mentioning)
confidence: 99%