Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.399
Domain Transfer based Data Augmentation for Neural Query Translation

Abstract: Query translation (QT) serves as a critical factor in successful cross-lingual information retrieval (CLIR). Due to the lack of parallel query samples, neural QT models are usually optimized with synthetic data derived from large-scale monolingual queries. Nevertheless, such pseudo corpora are mostly produced by a general-domain translation model, making them insufficient to guide the learning of the QT model. In this paper, we extend the data augmentation with a domain transfer procedure, t…
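The abstract describes producing synthetic parallel queries with a general-domain translation model and then steering the augmented data toward the query domain. As a loose sketch of that idea (not the paper's actual method), the snippet below pairs monolingual queries with machine-translated sources and keeps only pairs whose target side overlaps a query-domain vocabulary; `toy_translate`, `domain_vocab`, and the threshold are all illustrative stand-ins, not components from the paper.

```python
# Hedged sketch: back-translation-style data augmentation with a simple
# domain filter. A real system would use an NMT model in place of the stub.

def back_translate(queries, translate):
    """Pair each monolingual target-side query with a synthetic source."""
    return [(translate(q), q) for q in queries]

def domain_score(sentence, domain_vocab):
    """Fraction of tokens that belong to the query-domain vocabulary."""
    tokens = sentence.lower().split()
    if not tokens:
        return 0.0
    return sum(t in domain_vocab for t in tokens) / len(tokens)

def filter_by_domain(pairs, domain_vocab, threshold=0.5):
    """Keep only pseudo pairs whose target side looks query-like."""
    return [(s, t) for s, t in pairs if domain_score(t, domain_vocab) >= threshold]

# Toy stand-in for a general-domain translation model.
toy_translate = lambda q: " ".join(reversed(q.split()))

queries = ["cheap flights paris", "the quick brown fox jumps over dogs"]
domain_vocab = {"cheap", "flights", "paris", "hotel", "tickets"}

pairs = back_translate(queries, toy_translate)
kept = filter_by_domain(pairs, domain_vocab)
```

Here the second, general-domain sentence is filtered out, leaving only the query-like pair; the paper's domain-transfer step plays an analogous role at the level of the translation model rather than a post-hoc filter.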



Cited by 12 publications (8 citation statements). References 28 publications.
“…It is a promising direction to explore other behaviors in the future, such as clickthrough and editing operations. Moreover, following recent advancements in domain adaptation for NMT, we plan to further improve our model via adversarial-training-based knowledge transfer (Zeng et al., 2018; Yao et al., 2020; Su et al., 2021) and dual knowledge transfer (Zeng et al., 2019).…”
Section: Discussion
confidence: 99%
“…Personalized Machine Translation Recently, some researchers have employed domain adaptation (Zhang et al., 2019; Gururangan et al., 2020; Yao et al., 2020) to generate personalized translations. For example, Mirkin et al. (2015) show that the translation generated by the SMT model has an adverse effect on the prediction of author personalities, demonstrating the necessity of personalized machine translation.…”
Section: Related Work
confidence: 99%
“…[Flattened table excerpt: rows grouping bracketed reference numbers under the method categories "Sampling Rules" and "Seq2Seq"; the original table structure is not recoverable.]…”
Section: Text
confidence: unclassified