Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.2

Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism

Abstract: Non-autoregressive models generate target words in parallel, achieving faster decoding at the cost of translation accuracy. A promising approach to remedying the flawed translations of non-autoregressive models is to train a conditional masked translation model (CMTM) and refine the generated results over several iterations. Unfortunately, such an approach hardly considers the sequential dependency among target words, which inevitably results in translation degradation. Hence, instead o…
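The iterative refinement the abstract describes is easiest to see in code. Below is a minimal sketch of mask-predict style decoding (Ghazvininejad et al., 2019), the procedure CMTMs build on: the target starts fully masked, every position is predicted in parallel, and the least confident tokens are re-masked for the next pass. The `model` callable, `mask_id`, and the linear masking schedule are illustrative assumptions, not the paper's exact method.

```python
import torch

def mask_predict_decode(model, src, tgt_len, iterations=10, mask_id=0):
    """Iteratively refine a fully masked target with a CMTM.

    `model(src, tgt)` is a hypothetical callable returning per-position
    log-probabilities of shape (tgt_len, vocab); `mask_id` is an assumed
    id for the [mask] token.
    """
    tgt = torch.full((tgt_len,), mask_id, dtype=torch.long)
    for t in range(iterations):
        log_p = model(src, tgt)          # (tgt_len, vocab)
        conf, tgt = log_p.max(dim=-1)    # predict every position in parallel
        # Linearly decaying mask ratio: re-mask the least confident tokens
        # so later iterations only revise the model's weakest guesses.
        n_mask = int(tgt_len * (1 - (t + 1) / iterations))
        if n_mask == 0:
            break
        worst = conf.topk(n_mask, largest=False).indices
        tgt[worst] = mask_id
    return tgt
```

Because every position is predicted simultaneously, nothing in this loop conditions a token on its left neighbors, which is exactly the missing sequential dependency the paper's self-review mechanism targets.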

Cited by 10 publications (5 citation statements)
References 12 publications
“…(2) introducing a self-review mechanism [87], which applies an AR decoder to help infuse sequential information; (3) selecting masked tokens with advanced strategies [26], [34], [88]. Geng et al. [26] focus on the importance of determining the tokens replaced by [mask] tokens in the next iteration and propose a revisor and locator for rewriting.…”
Section: Encoder Stack
Mentioning, confidence: 99%
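To make the self-review idea in the quoted statement concrete, here is a toy sketch in which an autoregressive decoder re-scores a non-autoregressive draft and the least AR-plausible positions are re-masked for the next refinement pass. The `ar_decoder` callable and the keep-ratio heuristic are hypothetical stand-ins, not the actual mechanism of [87].

```python
import torch

def self_review_remask(ar_decoder, src, draft, mask_id, keep_ratio=0.7):
    # Score each draft token left-to-right under a hypothetical AR decoder:
    # ar_decoder(src, prefix) returns next-token log-probabilities.
    scores = torch.tensor([ar_decoder(src, draft[:i])[draft[i]].item()
                           for i in range(len(draft))])
    n_keep = max(1, int(len(draft) * keep_ratio))
    keep = scores.topk(n_keep).indices
    # Positions the AR view finds least likely are re-masked, so the next
    # CMTM iteration can rewrite them with sequential context in mind.
    reviewed = torch.full_like(draft, mask_id)
    reviewed[keep] = draft[keep]
    return reviewed
```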
“…Headline Generation. In recent years, text generation has made impressive progress (Li et al. 2019; Chan et al. 2019; Liu et al. 2020; Xie et al. 2020; Chan et al. 2020; Chen et al. 2021), and headline generation has become a research hotspot in Natural Language Processing. Most existing headline generation works solely focus on summarizing the document.…”
Section: Related Work
Mentioning, confidence: 99%
“…Multi-turn Dialog. In recent years, text generation has made impressive progress (Li et al. 2018; Chan et al. 2019; Gao et al. 2020b; Xie et al. 2020), and a multi-turn dialog model aims to take a message and utterances in previous turns as input and generate a response (Tao et al. 2019; Gao et al. 2020a). Several works (Zhang et al. 2019; Adiwardana et al. 2020; Chan et al. 2020) simplify multi-turn dialog into a single-turn problem by simply concatenating multiple sentences into one sentence, and utilize a basic Seq2seq model based on an RNN or Transformer to model the long sequence.…”
Section: Related Work
Mentioning, confidence: 99%
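The single-turn simplification described in the quoted statement amounts to flattening the dialog history before standard Seq2seq modeling. A minimal sketch, with `[SEP]` as an assumed separator token:

```python
def flatten_dialog(context_turns, message, sep="[SEP]"):
    # Join previous turns and the current message into one source sequence
    # that a single-turn Seq2seq model (RNN or Transformer) can consume.
    return f" {sep} ".join(context_turns + [message])

# Three prior turns plus the current message become one long input.
src = flatten_dialog(["Hi!", "Hello, how can I help?", "My order is late."],
                     "Can you check the status?")
```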