Proceedings of the CoNLL SIGMORPHON 2017 Shared Task: Universal Morphological Reinflection, 2017
DOI: 10.18653/v1/k17-2004

Align and Copy: UZH at SIGMORPHON 2017 Shared Task for Morphological Reinflection

Abstract: This paper presents the submissions by the University of Zurich to the SIGMORPHON 2017 shared task on morphological reinflection. The task is to predict the inflected form given a lemma and a set of morpho-syntactic features. We focus on neural network approaches that can tackle the task in a limited-resource setting. As the transduction of the lemma into the inflected form is dominated by copying over lemma characters, we propose two recurrent neural network architectures with hard monotonic attention that a…
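The truncated abstract frames reinflection as a transduction dominated by copying lemma characters under hard monotonic attention. As a rough, self-contained illustration of that framing (a sketch, not the authors' code), the Python below derives an oracle sequence of COPY/DELETE/INSERT edit actions from an LCS character alignment between a lemma and its inflected form; hard-attention transducers are typically trained on aligned action sequences of this kind. The action inventory and the LCS heuristic here are illustrative assumptions.

def lcs_table(a, b):
    # Longest-common-subsequence DP table between strings a and b.
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if a[i] == b[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    return dp

def oracle_actions(lemma, form):
    # Backtrack through the LCS table to emit a monotonic edit script:
    # COPY advances over both strings, DELETE skips a lemma character,
    # INSERT(c) writes a target character not taken from the lemma.
    dp = lcs_table(lemma, form)
    i, j, actions = len(lemma), len(form), []
    while i > 0 or j > 0:
        if i > 0 and j > 0 and lemma[i - 1] == form[j - 1]:
            actions.append("COPY")
            i, j = i - 1, j - 1
        elif j > 0 and (i == 0 or dp[i][j - 1] >= dp[i - 1][j]):
            actions.append("INSERT(" + form[j - 1] + ")")
            j -= 1
        else:
            actions.append("DELETE")
            i -= 1
    return actions[::-1]

# Copying dominates for typical inflections:
print(oracle_actions("walk", "walked"))
# -> ['COPY', 'COPY', 'COPY', 'COPY', 'INSERT(e)', 'INSERT(d)']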

Cited by 53 publications (63 citation statements); references 14 publications.
“…Another idea relevant to explore in future work is to consider the networks that are designed to be strong at character copying, which is the most common operation in string transduction tasks such as morphological segmentation, morphological reinflection and normalization (Gu et al., 2016; See et al., 2017; Makarov et al., 2017).…”
Section: Discussion
confidence: 99%
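The copy-friendly designs cited in this excerpt (Gu et al., 2016; See et al., 2017) share one core idea: the output distribution interpolates a generation distribution over the vocabulary with a copy distribution induced by attention over source positions. Below is a minimal NumPy sketch of that mixture in the spirit of the pointer-generator of See et al. (2017); all values and the function name are toy stand-ins, not the cited models' code.

import numpy as np

def pointer_generator_step(p_gen, vocab_dist, attention, src_ids, vocab_size):
    # p_gen      : scalar in (0, 1), probability of generating from the vocabulary
    # vocab_dist : (V,) softmax distribution over the output vocabulary
    # attention  : (T,) attention weights over source positions (sums to 1)
    # src_ids    : (T,) vocabulary id of the character at each source position
    copy_dist = np.zeros(vocab_size)
    np.add.at(copy_dist, src_ids, attention)  # scatter attention mass onto source chars
    return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist

# Toy example: 4-character vocabulary, source "aba" attended mostly at position 0.
vocab_dist = np.array([0.1, 0.2, 0.3, 0.4])
attention = np.array([0.7, 0.2, 0.1])
src_ids = np.array([0, 1, 0])  # 'a', 'b', 'a'
print(pointer_generator_step(0.3, vocab_dist, attention, src_ids, vocab_size=4))
# -> [0.59 0.2  0.09 0.12]; the copied source characters dominate the output.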
“…For the SIGMORPHON 2016 and the CoNLL-SIGMORPHON 2017 shared tasks (Cotterell et al., 2016, 2017), multiple MRI systems were developed, e.g., (Nicolai et al., 2016; Taji et al., 2016; Kann and Schütze, 2016; Östling, 2016; Makarov et al., 2017). Encoder-decoder neural networks (Cho et al., 2014a) performed best, such that we extend them in this work.…”
Section: Related Work
confidence: 91%
“…The choice of attention has a smaller impact. In both data settings, our best model on the validation set outperforms all submissions from the 2018 shared task except for UZH (Makarov and Clematide, 2018), which uses a more involved imitation learning approach and larger ensembles. In contrast, our only departure from standard seq2seq training is the drop-in replacement of softmax by entmax.…”
Section: Morphological Inflection
confidence: 94%
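The entmax mentioned in this excerpt denotes a family of sparse alternatives to softmax that can assign exactly zero probability to low-scoring outputs while remaining a drop-in replacement. For intuition, here is a NumPy sketch of sparsemax (Martins and Astudillo, 2016), the alpha = 2 member of that family; this is an illustration, not the cited work's implementation.

import numpy as np

def sparsemax(z):
    # Euclidean projection of the score vector z onto the probability simplex.
    z_sorted = np.sort(z)[::-1]             # scores in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum     # prefix of entries kept nonzero
    k_z = k[support][-1]                    # size of the support
    tau = (cumsum[k_z - 1] - 1) / k_z       # threshold subtracted from all scores
    return np.maximum(z - tau, 0.0)

scores = np.array([2.0, 1.0, -1.0])
print(sparsemax(scores))  # -> [1. 0. 0.]; softmax would give ~[0.71, 0.26, 0.04]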