Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation (2022)
DOI: 10.1007/978-981-19-7960-6_12

Cited by 5 publications (2 citation statements)
References 30 publications
“…Training of the paraphrase model. ParaZh-22M (Hao et al. 2022) is a large-scale paraphrase dataset with about 22M sentence pairs. We utilize the NER toolkit (TexSmart) to preprocess the dataset.…”
Section: The Construction of Positive Samples
Confidence: 99%
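The citing work does not spell out the preprocessing beyond "NER preprocessing". A minimal sketch of what such a step could look like is below, using spaCy's Chinese pipeline purely as a stand-in for TexSmart (whose actual API differs); the entity-masking scheme and the example sentence pair are assumptions, not the cited method.

```python
# Illustrative sketch only: spaCy stands in for the TexSmart NER toolkit
# named in the citation; the cited work's actual preprocessing is unspecified.
import spacy

# Assumes `python -m spacy download zh_core_web_sm` has been run.
nlp = spacy.load("zh_core_web_sm")

def mask_entities(sentence: str) -> str:
    """Replace named entities with typed placeholders (an assumed scheme)."""
    doc = nlp(sentence)
    parts, last = [], 0
    for ent in doc.ents:
        parts.append(sentence[last:ent.start_char])
        parts.append(f"[{ent.label_}]")  # e.g. [PERSON], [GPE]
        last = ent.end_char
    parts.append(sentence[last:])
    return "".join(parts)

# Hypothetical paraphrase pair in the style of ParaZh-22M.
src, tgt = "李明在北京工作。", "李明的工作地点在北京。"
print(mask_entities(src))
print(mask_entities(tgt))
```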
“…By configuring the output language to correspond with the input language, multilingual NMT can generate paraphrases directly, overcoming the first limitation. (2) Paraphrases generated from the paraphraser provide diversity in word selection while preserving the sentence's meaning [12, 11]. The meaning-preserving properties of paraphrase models can aid in addressing the second limitation.…”
Section: Introduction
Confidence: 99%
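To make the quoted idea concrete: a multilingual NMT model can be forced to decode into the same language as its input, yielding paraphrases. The sketch below is an illustration under assumptions, not the citing paper's setup; the M2M100 checkpoint and decoding settings are stand-ins.

```python
# Sketch: paraphrasing with a multilingual NMT model by setting the output
# language equal to the input language. M2M100 is an assumed stand-in model;
# the citing work's actual model and decoding settings are not given.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

def paraphrase(sentence: str, lang: str = "zh", num_return: int = 3):
    tokenizer.src_lang = lang                 # input language
    encoded = tokenizer(sentence, return_tensors="pt")
    generated = model.generate(
        **encoded,
        forced_bos_token_id=tokenizer.get_lang_id(lang),  # output = input language
        num_beams=5,
        num_return_sequences=num_return,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

print(paraphrase("今天的天气非常好。"))
```

Beam search with several returned sequences is one simple way to obtain the lexical diversity the quote mentions; sampling-based decoding would be another.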