Proceedings of the 19th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology 2022
DOI: 10.18653/v1/2022.sigmorphon-1.22
OSU at SigMorphon 2022: Analogical Inflection With Rule Features

Abstract: OSU's inflection system is a transformer whose input is augmented with an analogical exemplar showing how to inflect a different word into the target cell. In addition, alignment-based heuristic features indicate how well the exemplar is likely to match the output. OSU's scores substantially improve over the baseline transformer for instances where an exemplar is available, though not quite matching the challenge winner. In Part 2, the system shows a tendency to over-apply the majority pattern in English, but …
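The abstract describes augmenting the transformer's input with an analogical exemplar plus heuristic match features. The paper's exact input format is not shown here; the following is a minimal sketch assuming a character-level source sequence with hypothetical separator tokens (`<EX>`, `<SEP>`) and feature tokens (`<F=…>`).

```python
def build_input(lemma, tags, ex_lemma, ex_inflected, features):
    """Assemble a character-level source sequence for a seq2seq inflection
    model, augmented with an analogical exemplar (hypothetical format)."""
    # Lemma characters followed by morphosyntactic tag tokens.
    src = list(lemma) + [f"<{t}>" for t in tags]
    # Exemplar pair for the same target cell, delimited by special tokens.
    src += ["<EX>"] + list(ex_lemma) + ["<SEP>"] + list(ex_inflected)
    # Heuristic features indicating how well the exemplar is expected
    # to match the output (e.g. a shared-suffix indicator).
    src += [f"<F={f}>" for f in features]
    return src

seq = build_input("walk", ["V", "PST"], "talk", "talked", ["suf_match"])
```

The model would then be trained to decode the inflected form (`walked`) from this augmented sequence, rather than from the lemma and tags alone.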

Cited by 5 publications (5 citation statements) · References 17 publications
“…Additionally, data augmentation (Anastasopoulos and Neubig, 2019; Silfverberg et al., 2017) can also improve the performance of models in low-resource languages. Seq2seq models, such as RNN+attention (Wiemerslage et al., 2018) or the Transformer (Yang et al., 2022; Merzhevich et al., 2022; Elsner and Court, 2022), have become a popular framework for morphological inflection in recent years. In this framework, the lemma and tags are usually input together, and the model generates the inflected word.…”
Section: Related Work
confidence: 99%
“…The authors reported their best average accuracy of 55.5% when the number of hallucinated examples is 1,000. Elsner and Court (2022) augment each training instance with a representative exemplar lemma and its output form. Training instances are also augmented with rule-based features constructed by aligning the source lemma with its inflected form and the exemplar lemma with its inflected form.…”
Section: Related Work
confidence: 99%
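The statement above describes deriving rule features by aligning each lemma with its inflected form. As a simplified illustration (not the paper's actual alignment procedure), one can align on the longest common prefix to extract a suffix-replacement rule, then compare the rules of the source pair and the exemplar pair as a binary feature:

```python
import os

def suffix_rule(lemma, inflected):
    """Derive a simple suffix-replacement rule by aligning lemma and
    inflected form on their longest common prefix (a simplification
    of the alignment-based features described above)."""
    i = len(os.path.commonprefix([lemma, inflected]))
    return lemma[i:], inflected[i:]

def same_rule(pair_a, pair_b):
    """Hypothetical binary feature: do the source pair and the
    exemplar pair follow the same suffix-replacement rule?"""
    return suffix_rule(*pair_a) == suffix_rule(*pair_b)

same_rule(("walk", "walked"), ("talk", "talked"))  # both apply -ed
```

A feature like this flags whether the exemplar is likely to demonstrate the correct pattern for the current word, which is the role the heuristic features play in the system described in the abstract.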