Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.325

Lexically Constrained Neural Machine Translation with Levenshtein Transformer

Abstract: This paper proposes a simple and effective algorithm for incorporating lexical constraints in neural machine translation. Previous work either required re-training existing models with the lexical constraints or incorporated them during beam search decoding with significantly higher computational overhead. Leveraging the flexibility and speed of the recently proposed Levenshtein Transformer model (Gu et al., 2019), our method injects terminology constraints at inference time without any impact on decoding speed.
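A minimal sketch of the idea the abstract describes: seed the edit-based decoder's initial target with the constraint tokens, then mask the deletion operation so those tokens survive every refinement round. All names here (`seed_target`, `delete_step`, `insert_step`) are hypothetical illustrations, not the authors' code or any library API; in real decoding the deletion and insertion decisions would come from the trained Levenshtein Transformer rather than the hard-coded toy scores below.

```python
# Toy illustration (assumed names, not the paper's implementation) of
# constraint injection for a Levenshtein-style, edit-based decoder.

from typing import Dict, List, Tuple

Token = Tuple[str, bool]  # (surface form, is_constraint)

def seed_target(constraints: List[List[str]]) -> List[Token]:
    """Start decoding from the constraint tokens instead of an empty target."""
    seq: List[Token] = [("<s>", True)]
    for phrase in constraints:
        seq.extend((tok, True) for tok in phrase)
    seq.append(("</s>", True))
    return seq

def delete_step(seq: List[Token], delete_prob: List[float],
                threshold: float = 0.5) -> List[Token]:
    """One deletion round: drop tokens the model votes to delete, except
    constraint tokens, whose deletion is masked out."""
    return [(tok, is_con) for (tok, is_con), p in zip(seq, delete_prob)
            if is_con or p < threshold]

def insert_step(seq: List[Token], inserts: Dict[int, List[str]]) -> List[Token]:
    """One insertion round: inserts[i] holds the tokens the model places
    after position i. Inserted tokens stay deletable in later rounds."""
    out: List[Token] = []
    for i, item in enumerate(seq):
        out.append(item)
        out.extend((tok, False) for tok in inserts.get(i, []))
    return out

# Toy run: force the term "Levenshtein" into the output.
seq = seed_target([["Levenshtein"]])
seq = insert_step(seq, {0: ["the"], 1: ["transformer"]})  # model's insertions
seq = delete_step(seq, [0.0, 0.9, 0.0, 0.1, 0.0])         # model's deletions
print([tok for tok, _ in seq])  # ['<s>', 'Levenshtein', 'transformer', '</s>']
```

Because constraint status travels with each token, the protection is robust to the index shifts that insertions cause, which is what lets the constraints act as hard guarantees without slowing the parallel refinement loop.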

Cited by 56 publications (72 citation statements); references 14 publications.
“…It also drastically speeds up decoding compared with lexically constrained decoding algorithms (Post and Vilar, 2018). Furthermore, results highlight the benefits of soft constraints over hard ones: EDITOR with soft constraints achieves translation quality on par with or better than both EDITOR and Levenshtein Transformer with hard constraints (Susanto et al., 2020).…”
Section: Introduction
confidence: 84%
“…• NAR models: EDITOR and LevT treat the lexical constraints as soft constraints, provided via the initial target sequence. We also explore the decoding technique introduced in Susanto et al. (2020) to support hard constraints.…”
Section: Experimental Conditions
confidence: 99%
“…The approach generates all the constraints in the final output. Other works (Hasler et al., 2018; Post and Vilar, 2018; Susanto et al., 2020) attempt to reduce the computational overhead caused by using multiple beams during inference, a well-known weakness of this approach. Like the previous approach, constrained decoding does not consider target context when inserting translation terms, as it fixes the target form first and then produces a target context that fits this constraint.…”
Section: Related Work
confidence: 99%
“…Incorporating bilingual lexicons and phrases into NMT. Our method is also inspired by studies that incorporate bilingual lexicons and phrases into NMT (Arthur et al., 2016; Zhang and Zong, 2016; Feng et al., 2017; Hasler et al., 2018; Zhao et al., 2018b; Zhao et al., 2018a; Dinu et al., 2019; Huck et al., 2019; Liu et al., 2019; Susanto et al., 2020). These works utilize external bilingual lexicons and phrases to improve lexical and phrase translation.…”
Section: Related Work
confidence: 99%