Proceedings of The 2018
DOI: 10.18653/v1/k18-2013

Abstract: In this paper we describe the TurkuNLP entry at the CoNLL 2018 Shared Task on Multilingual Parsing from Raw Text to Universal Dependencies. Compared to last year, this year's shared task includes two new main metrics that measure morphological tagging and lemmatization accuracy in addition to syntactic trees. Motivated by these new metrics, we developed an end-to-end parsing pipeline with a particular focus on a novel, state-of-the-art lemmatization component. Our system r…

Cited by 28 publications (14 citation statements)
References 9 publications
“…Omorfi uses finite state transducers and can further be used to enhance lemmatization (Silfverberg et al., 2016). The current state of the art is the Turku neural parser pipeline (Kanerva et al., 2018; Kanerva et al., 2019), which treats lemmatization as a sequence-to-sequence problem using the OpenNMT neural machine translation toolkit (Klein et al., 2017) and yields 95-97% accuracy.…”
Section: Related Work
confidence: 99%
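The sequence-to-sequence framing mentioned above treats the lemma as a character string to be "translated" from the input word plus its morphological tags. The sketch below shows one plausible way such an input could be serialized for a character-level translation model; the field layout and the `to_source_sequence` helper are illustrative assumptions, not the TurkuNLP pipeline's actual format.

```python
# Hypothetical sketch: serializing a (word, morphology) pair into the
# character-level source sequence a seq2seq lemmatizer translates from.
# The exact field layout is an assumption, not the published format.

def to_source_sequence(word, upos, feats):
    """Split the word into characters and append morphological tags,
    so the decoder can condition lemma generation on them."""
    chars = list(word)
    tags = [f"UPOS={upos}"] + [f"{k}={v}" for k, v in sorted(feats.items())]
    return " ".join(chars + tags)

# Finnish "koirille" ("for the dogs") should lemmatize to "koira"
src = to_source_sequence("koirille", "NOUN", {"Case": "All", "Number": "Plur"})
print(src)  # k o i r i l l e UPOS=NOUN Case=All Number=Plur
```

A decoder trained on such pairs then emits the lemma character by character, which lets the same model generalize to unseen word forms.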
“…In particular, the Bi-LSTM-based deep biaffine neural dependency parser by Dozat and Manning (2017) has been quite popular and was used in three of the five top submissions to the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies (Zeman et al., 2018), including the top non-ensemble submission (Kanerva et al., 2018).…”
Section: Syntactic Parsing
confidence: 99%
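At the core of the deep biaffine parser cited above, every token is scored as a potential dependent of every other token via a bilinear form between head-role and dependent-role representations. The following is a minimal numpy sketch of that arc-scoring step; the dimensions, variable names, and random inputs are illustrative assumptions, not the parser's actual configuration.

```python
# Minimal sketch of deep biaffine arc scoring (after Dozat and Manning, 2017):
# scores = H_dep @ U @ H_head^T + H_dep @ u, where H_dep/H_head are the
# dependent- and head-role MLP outputs over the Bi-LSTM states.
# Sizes and random inputs here are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8                       # 5 tokens, hidden size 8
h_dep = rng.normal(size=(n, d))   # dependent-role representations
h_head = rng.normal(size=(n, d))  # head-role representations
U = rng.normal(size=(d, d))       # bilinear interaction term
u = rng.normal(size=(d,))         # head-independent bias term

# scores[i, j] = score for token j heading token i
scores = h_dep @ U @ h_head.T + (h_dep @ u)[:, None]
pred_heads = scores.argmax(axis=1)     # greedy head choice per token
print(scores.shape, pred_heads.shape)  # (5, 5) (5,)
```

In the full parser the greedy argmax is replaced by a maximum-spanning-tree decode, and a second biaffine classifier assigns the dependency label for each chosen arc.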
“…Parser Analyses and Comparisons. Recent years have seen a wide range of studies comparing different language models for dependency parsing (e.g., Kanerva et al., 2018; Smith et al., 2018). Additionally, several studies have investigated the amount of implicit syntactic information captured in pre-trained LMs such as ELMo and BERT (Tenney et al., 2019a,b; Hewitt and Manning, 2019).…”
Section: Related Work
confidence: 99%
“…(2) the rise of pre-trained distributed word representations, particularly transformer-based contextualized embeddings such as BERT (Devlin et al., 2019) or RoBERTa (Liu et al., 2019). Both characteristics are present in recent top-performing systems (Che et al., 2018; Kondratyuk and Straka, 2019; Kanerva et al., 2018). However, there remain a considerable number of implementation and configuration choices whose impact on parser performance is less well understood.…”
Section: Introduction
confidence: 99%