2021 16th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP)
DOI: 10.1109/isai-nlp54397.2021.9678159
A Study of Levenshtein Transformer and Editor Transformer Models for Under-Resourced Languages


Cited by 2 publications (1 citation statement)
References 4 publications
“…We can implement the EDITOR model with soft lexical constraints, with hard lexical constraints, or without constraints. The authors in [46] applied the EDITOR and Levenshtein Transformer models to two-way translation of the Thai–Myanmar, Thai–English, and Myanmar–English language pairs without using any lexical constraints, and they showed that the EDITOR model achieves better translation performance in the English-to-Thai, Thai-to-English, and English-to-Myanmar directions. Moreover, Xu and Carpuat [44] stated that EDITOR performs more effectively than the Levenshtein Transformer when utilizing soft lexical constraints.…”
Section: Methods
Confidence: 99%
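The citation statement above refers to edit-based (non-autoregressive) translation models, which refine a hypothesis by deleting and inserting tokens rather than generating left to right. The toy sketch below illustrates only that refinement loop; the edit decisions are hard-coded for demonstration, whereas in the Levenshtein Transformer and EDITOR they come from learned deletion and insertion classifiers.

```python
# Toy illustration of one edit-based refinement step (deletion pass, then
# insertion pass), in the spirit of Levenshtein Transformer / EDITOR.
# The masks and insertion choices here are hand-picked assumptions, not
# model outputs.

def apply_deletions(tokens, delete_mask):
    """Drop every token whose mask entry is True."""
    return [t for t, d in zip(tokens, delete_mask) if not d]

def apply_insertions(tokens, insertions):
    """Insert tokens after given positions.
    `insertions` maps a position index -> list of tokens to add after it."""
    out = []
    for i, t in enumerate(tokens):
        out.append(t)
        out.extend(insertions.get(i, []))
    return out

hyp = ["the", "cat", "cat", "sat"]
hyp = apply_deletions(hyp, [False, False, True, False])  # drop the repeat
hyp = apply_insertions(hyp, {2: ["on", "the", "mat"]})   # extend the tail
print(hyp)  # ['the', 'cat', 'sat', 'on', 'the', 'mat']
```

Soft lexical constraints, as discussed by Xu and Carpuat [44], would bias the insertion pass toward producing specified target words without strictly forcing them into the output.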