Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.733
Frugal Paradigm Completion

Abstract: Lexica distinguishing all morphologically related forms of each lexeme are crucial to many language technologies, yet building them is expensive. We propose Frugal Paradigm Completion, an approach that predicts all related forms in a morphological paradigm from as few manually provided forms as possible. It induces typological information during training which it uses to determine the best sources at test time. We evaluate our language-agnostic approach on 7 diverse languages. Compared to popular alternative a…
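The test-time idea the abstract describes (fill each empty paradigm cell from the best available manually provided source form) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the cell labels, the source-preference table, and the toy suffixing inflector are all illustrative assumptions.

```python
def complete_paradigm(observed, all_cells, source_ranking, inflect):
    """Fill every cell of a paradigm from a few observed forms.

    observed: dict mapping cell label -> attested form
    all_cells: list of all cell labels in the paradigm
    source_ranking: dict mapping target cell -> source cells in preference order
                    (standing in for the typologically induced ranking)
    inflect: callable (source_form, source_cell, target_cell) -> predicted form
    """
    paradigm = dict(observed)
    for target in all_cells:
        if target in paradigm:
            continue
        # Use the highest-ranked source cell that was actually provided.
        for source in source_ranking.get(target, list(observed)):
            if source in observed:
                paradigm[target] = inflect(observed[source], source, target)
                break
    return paradigm

# Toy usage with a trivial rule-based "inflector" (purely illustrative):
rules = {("NOM.SG", "NOM.PL"): lambda w: w + "s"}
inflect = lambda form, src, tgt: rules.get((src, tgt), lambda w: w)(form)

cells = ["NOM.SG", "NOM.PL"]
ranking = {"NOM.PL": ["NOM.SG"]}
print(complete_paradigm({"NOM.SG": "cat"}, cells, ranking, inflect))
# {'NOM.SG': 'cat', 'NOM.PL': 'cats'}
```

The point of the sketch is only the control flow: sources are chosen per target cell from an induced preference ordering, so different cells may be predicted from different manually provided forms.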

Cited by 1 publication (1 citation statement)
References 36 publications
“…First, although the SIGMORPHON 2020 datasets are balanced by paradigm cell, real datasets are Zipfian, with sparse coverage of cells (Blevins et al., 2017; Lignos and Yang, 2018). For languages with large paradigms, the model thus requires the capacity to fill cells for which no exemplar can be retrieved, perhaps using a variant of adaptive source selection (Erdmann et al., 2020; Kann et al., 2017a). Second, the similar-exemplar model performs better in one-shot transfer experiments, but is hampered in the su- […] Finally, since the memory-based architecture is cognitively inspired, it might be adapted as a cognitive model of language learning in contact situations.…”
Section: Discussion
confidence: 99%
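The citing passage's premise, that Zipfian cell frequencies leave most paradigm cells sparsely attested, is easy to illustrate numerically. The sketch below is an assumption-laden toy (frequency proportional to 1/rank, a hypothetical 100-cell paradigm), not a result from the cited papers:

```python
def zipf_mass(n_cells, head):
    """Fraction of token mass carried by the `head` most frequent cells
    when cell frequency is proportional to 1/rank (a classic Zipf law)."""
    weights = [1.0 / r for r in range(1, n_cells + 1)]
    return sum(weights[:head]) / sum(weights)

# For a 100-cell paradigm, the 10 most frequent cells already carry
# over half of all tokens, so the remaining 90 cells share the rest.
print(zipf_mass(100, 10))
```

Under this distribution, a realistically sized corpus will simply never attest many of the tail cells, which is why a model may need to fill cells for which no exemplar can be retrieved.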