2020
DOI: 10.1609/aaai.v34i05.6348

Global Greedy Dependency Parsing

Abstract: Most syntactic dependency parsing models fall into one of two categories: transition-based and graph-based models. The former enjoy high inference efficiency with linear time complexity, but they rely on stacking or re-ranking partially built parse trees to produce a complete parse tree, and their training is slowed by the need for dynamic oracle training. The latter, graph-based models, may boast better performance but are unfortunately marred by polynomial-time inference. In this paper, …
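The greedy head-selection step that the abstract contrasts with polynomial-time graph decoding can be sketched as follows. The score matrix and function name here are hypothetical illustrations, not the paper's actual implementation; a full parser would additionally repair cycles and enforce a single root:

```python
def greedy_heads(scores):
    """Greedily pick the highest-scoring head for every token.

    scores[i][j] is a (hypothetical) arc score for token i choosing
    head j; index 0 is the artificial ROOT and takes no head. Each
    token decides independently, so this step runs in O(n^2) total,
    versus polynomial-time exact decoding over the full graph.
    """
    heads = []
    for i in range(1, len(scores)):          # skip ROOT at index 0
        best_head, best_score = 0, float("-inf")
        for j, s in enumerate(scores[i]):
            if j != i and s > best_score:    # a token cannot head itself
                best_head, best_score = j, s
        heads.append(best_head)
    return heads
```

For example, with two tokens whose best heads are ROOT and token 1 respectively, the decoder returns `[0, 1]`.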


Cited by 29 publications (10 citation statements). References 23 publications.
“…Recently, dependency syntactic parsing has been further developed with neural networks and has attained new state-of-the-art results (Zhang, Zhao, and Qin 2016; Li et al. 2018; Ma et al. 2018; Li, Zhao, and Parnow 2020). Benefiting from highly accurate parsers, neural network models can enjoy even higher accuracy gains by leveraging syntactic information rather than ignoring it (Chen et al. 2017a; 2017b; Duan et al. 2019).…”
Section: Syntactic Structures (mentioning)
confidence: 99%
“…In this work, we develop an Open IE system by adapting a modified span selection model which has been applied in Semantic Role Labeling (SRL) (Ouchi, Shindo, and Matsumoto 2018; He et al. 2018b; Li et al. 2019; Zhao, Zhang, and Kit 2013; Zhao, Chen, and Kit 2009; Li et al. 2018b; Cai et al. 2018), coreference resolution (Shou and Zhao 2012; Zhang, Wu, and Zhao 2012), and syntactic parsing (Zhang, Zhao, and Qin 2016; Ma and Zhao 2012; Li, Zhao, and Parnow 2020; Zhou and Zhao 2019; Li et al. 2018a). The advantage of a span model is that span-level features can be fully exploited, which is hard to do in token-based sequence labeling models.…”
Section: Sentence (mentioning)
confidence: 99%
“…The Relation Encoder converts the syntax tree of the input text into a syntax graph that describes the global relationships among the input tokens. The syntactic dependency parse tree [26,27] is one of the traditional ways to describe linguistic dependency relations among words. In a tree structure, only words that are directly related in a sentence are connected.…”
Section: Relation Encoder (mentioning)
confidence: 99%
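The tree-to-graph conversion described in the last statement can be illustrated with a small sketch. The representation chosen here (a head-index list in, an undirected adjacency map out) is an assumption for illustration, not the cited encoder's actual data structure:

```python
def tree_to_adjacency(heads):
    """Convert a dependency tree, given as a list of head indices for
    tokens 1..n (0 denotes the artificial ROOT), into an undirected
    adjacency map in which only directly related words are connected.
    """
    adj = {i: set() for i in range(len(heads) + 1)}
    for dep, head in enumerate(heads, start=1):
        adj[dep].add(head)   # edge from dependent to its head
        adj[head].add(dep)   # and back, making the graph undirected
    return adj
```

With the two-token tree `[0, 1]` (token 1 attached to ROOT, token 2 attached to token 1), only the directly related pairs (0,1) and (1,2) end up connected.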