2013
DOI: 10.1162/coli_a_00125

Mildly Non-Projective Dependency Grammar

Abstract: Syntactic representations based on word-to-word dependencies have a long-standing tradition in descriptive linguistics, and receive considerable interest in many applications. Nevertheless, dependency syntax has remained somewhat of an island from a formal point of view. Moreover, most formalisms available for dependency grammar are restricted to projective analyses, and thus not able to support natural accounts of phenomena such as wh-movement and cross-serial dependencies. In this article we present a formal…

Cited by 34 publications (35 citation statements); references 36 publications.
“…Another pervasive property of languages is projectivity, the property that, in linearizations of dependency graphs, the lines connecting heads and dependents do not cross (18). Ferrer i Cancho (19) has argued that this ubiquitous property of languages arises from dependency length minimization, because orders that minimize dependency length have a small number of crossing dependencies on average.…”
Section: Significance
confidence: 99%
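The no-crossing condition quoted above can be checked directly: an arc spanning positions a..b and an arc spanning c..d cross exactly when their endpoints interleave (a < c < b < d). The following is a minimal Python sketch of that check; the head-list encoding and function names are illustrative choices, not from the cited work, and it tests only the crossing-arcs condition in the quotation's definition of projectivity:

```python
def arcs(heads):
    # heads[i-1] is the head position of word i (1-based); 0 marks the root
    return [(min(i, h), max(i, h))
            for i, h in enumerate(heads, start=1) if h != 0]

def is_projective(heads):
    """True iff no two dependency arcs cross in the linear order."""
    a_list = arcs(heads)
    for (a, b) in a_list:
        for (c, d) in a_list:
            if a < c < b < d:  # endpoints interleave: the arcs cross
                return False
    return True
```

For example, `is_projective([3, 4, 0, 3])` is False, because the arcs (1, 3) and (2, 4) interleave.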
“…Producing these strings requires a stack-like data structure where some number of a's are pushed onto the stack so that the same number of b's can be popped from it. The hierarchical structures of natural language are widely believed to be mildly context-sensitive (Shieber, 1985; Weir, 1988; Seki et al., 1991; Joshi and Schabes, 1997; Kuhlmann, 2013), so this result shows that LSTMs are practically capable of inducing the proper data structures to handle the hierarchical structure of natural language. What remains to be seen in a general way is that LSTMs induce and use these structures when trained on natural language input, rather than artificial language input.…”
Section: Introduction
confidence: 94%
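The stack-based procedure the quotation describes can be made concrete. Below is a minimal Python sketch of a recognizer for the language a^n b^n (the function name is an assumption for illustration, not from the cited papers):

```python
def accepts_anbn(s):
    """Recognize { a^n b^n : n >= 0 } with an explicit stack."""
    stack = []
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:        # an 'a' appearing after a 'b' is out of order
                return False
            stack.append(ch)  # push one symbol per 'a'
        elif ch == 'b':
            seen_b = True
            if not stack:     # more b's than a's
                return False
            stack.pop()       # pop one symbol per 'b'
        else:
            return False      # reject any other symbol
    return not stack          # accept only if the counts match
```

So `accepts_anbn("aabb")` is True, while `accepts_anbn("abab")` and `accepts_anbn("aab")` are False.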
“…2-Crossing Interval trees are not necessarily well-nested and can have unbounded block degree (Kuhlmann, 2013). Figure 2 shows an example of a 2-Crossing Interval tree (all crossed edges are incident to either a or b; no children are on the far side of their parent) in which the subtrees rooted at a and b are ill-nested and each has a block degree of n + 1.…”
Section: K-Crossing Interval Trees
confidence: 99%
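Block degree, as used in the quotation, counts the maximal contiguous intervals ("blocks") that a subtree's yield occupies in the sentence; a projective subtree has block degree 1. A small illustrative Python sketch, assuming the yield is given as a set of integer word positions (the function name is an assumption):

```python
def block_degree(positions):
    """Number of maximal contiguous blocks in a set of word positions."""
    ps = sorted(positions)
    if not ps:
        return 0
    blocks = 1
    for prev, cur in zip(ps, ps[1:]):
        if cur != prev + 1:  # a gap starts a new block
            blocks += 1
    return blocks
```

For instance, a yield covering positions {1, 2, 4, 5} splits into the two blocks [1, 2] and [4, 5], so its block degree is 2.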