Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019)
DOI: 10.18653/v1/n19-1077
Viable Dependency Parsing as Sequence Labeling

Abstract: We recast dependency parsing as a sequence labeling problem, exploring several encodings of dependency trees as labels. While dependency parsing by means of sequence labeling had been attempted in existing work, results suggested that the technique was impractical. We show instead that with a conventional BiLSTM-based model it is possible to obtain fast and accurate parsers. These parsers are conceptually simple, not needing traditional parsing algorithms or auxiliary structures. However, experiments on the P…
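To make the encoding idea concrete, the sketch below shows one simple way a dependency tree can be flattened into one label per word and recovered again, using a relative head-offset label of the form offset@relation. This label format is an illustrative assumption; the paper itself explores several encodings (including a relative PoS-based one), and a full system would additionally repair ill-formed label sequences.

```python
# Minimal sketch of a tree-to-label encoding in the spirit of parsing as
# sequence labeling: each word receives one label encoding its head and
# relation. Illustrative only, not the paper's exact encodings.

from typing import List, Tuple

def encode(heads: List[int], rels: List[str]) -> List[str]:
    """Turn a dependency tree (1-based head indices, 0 = root) into
    one label per word of the form '<relative head offset>@<relation>'."""
    labels = []
    for i, (head, rel) in enumerate(zip(heads, rels), start=1):
        offset = head - i            # signed distance from the word to its head
        labels.append(f"{offset}@{rel}")
    return labels

def decode(labels: List[str]) -> Tuple[List[int], List[str]]:
    """Inverse mapping: labels back to head indices and relations.
    (A real parser would also repair ill-formed predicted sequences.)"""
    heads, rels = [], []
    for i, label in enumerate(labels, start=1):
        offset_str, rel = label.split("@", 1)
        heads.append(i + int(offset_str))
        rels.append(rel)
    return heads, rels

# Example: "She reads books" with 'reads' as root.
heads = [2, 0, 2]
rels = ["nsubj", "root", "obj"]
labels = encode(heads, rels)        # ['1@nsubj', '-2@root', '-1@obj']
assert decode(labels) == (heads, rels)
```

Because the tree is reduced to one tag per token, any off-the-shelf sequence labeler can then be trained on these labels, which is what makes the approach fast and conceptually simple.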

Cited by 59 publications (101 citation statements)
References: 23 publications
“…This also opens the question of whether a different architecture could better suit the purpose of leveraging the gaze information in a consistent way. In this context, a potential line of work could adapt human-attention approaches (Barrett et al, 2018) for structured prediction and word-level classification, although it would come at a cost of speed for parsing as sequence labeling (Strzyz et al, 2019b).…”
Section: Discussion
confidence: 99%
“…Dependency parsing as sequence labeling: We proceed similarly to Strzyz et al (2019b). Given a linearization function F_{|w|}:…”
Section: Gaze-averaged Sequence Labeling Parsing
confidence: 99%
“…Experiment 2 We used the best predictions (when using chunking) from experiment 1 as additional features for a sequence-labelling dependency parser (Strzyz et al, 2019). Therefore, network input consisted of word and character embedding and then some combination of POS tags, morphological feature tags, or chunk labels with the sole output being a dependency parser tag.…”
Section: Methods
confidence: 99%
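For context on the quoted setup, here is a minimal sketch, with assumed layer sizes and a simplified character encoder, of a BiLSTM tagger whose per-token input concatenates word, character, and auxiliary tag (e.g. POS, morphological, or chunk) embeddings, and whose sole output is a dependency label per token. The class name and all dimensions are illustrative assumptions, not taken from the cited work.

```python
# Hedged sketch of a BiLSTM sequence-labeling parser with auxiliary tag
# features. Hyperparameters are illustrative, not the cited configuration.

import torch
import torch.nn as nn

class SeqLabelingParser(nn.Module):
    def __init__(self, n_words, n_chars, n_aux, n_labels,
                 w_dim=100, c_dim=30, a_dim=20, hidden=400):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, w_dim)
        self.char_emb = nn.Embedding(n_chars, c_dim)
        # character-level BiLSTM summarising each word's character sequence
        self.char_lstm = nn.LSTM(c_dim, c_dim, batch_first=True,
                                 bidirectional=True)
        self.aux_emb = nn.Embedding(n_aux, a_dim)   # POS / morph / chunk tags
        self.encoder = nn.LSTM(w_dim + 2 * c_dim + a_dim, hidden,
                               batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_labels)

    def forward(self, words, chars, aux):
        # words: (batch, seq); chars: (batch, seq, max_chars); aux: (batch, seq)
        b, s, c = chars.shape
        char_h, _ = self.char_lstm(self.char_emb(chars.view(b * s, c)))
        char_repr = char_h[:, -1, :].view(b, s, -1)   # last-step summary per word
        x = torch.cat([self.word_emb(words), char_repr,
                       self.aux_emb(aux)], dim=-1)
        h, _ = self.encoder(x)
        return self.out(h)   # per-token scores over dependency labels

# Toy forward pass with made-up vocabulary sizes.
model = SeqLabelingParser(n_words=1000, n_chars=60, n_aux=20, n_labels=300)
words = torch.randint(0, 1000, (2, 5))
chars = torch.randint(0, 60, (2, 5, 8))
aux = torch.randint(0, 20, (2, 5))
print(model(words, chars, aux).shape)   # torch.Size([2, 5, 300])
```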
“…Figure 1a illustrates the encoding with an example. Dependency parsing as tagging: Strzyz et al (2019) also propose a linearization method Π_{|w|}:…”
Section: Introduction
confidence: 99%