Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-long.310
Towards User-Driven Neural Machine Translation

Abstract: A good translation should not only translate the original content semantically, but also incarnate personal traits of the original text. For a real-world neural machine translation (NMT) system, these user traits (e.g., topic preference, stylistic characteristics and expression habits) can be preserved in user behavior (e.g., historical inputs). However, current NMT systems marginally consider the user behavior due to: 1) the difficulty of modeling user portraits in zero-shot scenarios, and 2) the lack of user…

Cited by 12 publications (13 citation statements); references 27 publications.
“…Pham et al (2019) change existing word embeddings to have domain-specific features which are activated or deactivated for a given input, improving lexical choice. Lin et al (2021) cache keywords for individual users with dedicated subnetworks, effectively tracking user 'domain'-specific vocabulary embeddings to combine with the generic embeddings. Sato et al (2020) replace generic translation word embeddings with domain-specific vocabulary embeddings learned by domain-specific language models from monolingual data.…”
Section: Domain-specific Subnetwork
Relationship: mentioning, confidence: 99%
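The combination described above — blending generic translation embeddings with user- or domain-specific ones — can be sketched with a simple per-dimension gate. This is an illustrative toy, assuming a sigmoid gating mechanism; it is not the exact formulation of Pham et al (2019) or Lin et al (2021):

```python
import math

def combine_embeddings(generic, domain, gate_logits):
    """Blend a generic embedding with a domain/user-specific one.

    A per-dimension sigmoid gate decides how much of each vector to keep:
    gate -> 1 favours the domain-specific features, gate -> 0 the generic ones.
    """
    mixed = []
    for g, d, z in zip(generic, domain, gate_logits):
        gate = 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes the logit to (0, 1)
        mixed.append(gate * d + (1.0 - gate) * g)
    return mixed

generic = [0.2, -0.5, 1.0]   # shared, domain-agnostic embedding
domain = [1.0, 0.0, -1.0]    # user/domain-specific embedding
mixed = combine_embeddings(generic, domain, [0.0, 0.0, 0.0])
# zero logits give gate = 0.5, i.e. the element-wise mean of the two vectors
```

In a trained model the gate logits would themselves be predicted from the input, which is what lets domain features be "activated or deactivated for a given input".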
“…Contrastive learning has been a widely-used technique to learn representations in both computer vision (Hjelm et al 2019; Khosla et al 2020; Chen et al 2020) and natural language processing (Logeswaran and Lee 2018; Fang et al 2020; Gao, Yao, and Chen 2021; Lin et al 2021). There are also several recent works that attempt to boost machine translation with the effectiveness of contrastive learning.…”
Section: Related Work
Relationship: mentioning, confidence: 99%
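The contrastive objectives these works build on typically pull an anchor representation toward a positive example and push it away from negatives. A minimal InfoNCE-style sketch, generic rather than any one cited paper's formulation:

```python
import math

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss over dot-product similarities.

    The anchor should score high against its positive and low against the
    negatives; the loss is the negative log-softmax of the positive pair.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    logits = [dot(anchor, positive) / temperature]
    logits += [dot(anchor, n) / temperature for n in negatives]
    m = max(logits)  # max-shift for numerical stability
    log_norm = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_norm - logits[0]

# Aligned positive and orthogonal/opposed negatives => near-zero loss.
loss = info_nce([1.0, 0.0], [1.0, 0.0], [[0.0, 1.0], [-1.0, 0.0]])
```

In the NMT setting, the anchor and positive might be representations of a sentence and its translation (or a user's historical inputs), with other sentences in the batch serving as negatives.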
“…Diverse NMT. Improving translation diversity has been a hot topic in the NMT community in recent years, such as lattice-based NMT (Su et al, 2017; Tan et al, 2018) and personalized NMT (Michel and Neubig, 2018; Lin et al, 2021). Existing works for diverse NMT can be categorized into three major categories.…”
Section: Related Work
Relationship: mentioning, confidence: 99%