2018
DOI: 10.1162/tacl_a_00017

Scheduled Multi-Task Learning: From Syntax to Translation

Abstract: Neural encoder-decoder models of machine translation have achieved impressive results, while learning linguistic knowledge of both the source and target languages in an implicit end-to-end manner. We propose a framework in which our model begins learning syntax and translation interleaved, gradually putting more focus on translation. Using this approach, we achieve considerable improvements in terms of BLEU score on a relatively large parallel corpus (WMT14 English to German) and a low-resource (WIT German to English) […]
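The abstract's key idea is a sampling schedule that starts syntax-heavy and drifts toward translation-only. The following minimal Python sketch illustrates one way such a schedule could look; the exponential shape, the constant k, and all function names are illustrative assumptions, not the paper's actual schedule.

```python
import math
import random

# Minimal sketch of the "scheduled" part of scheduled multi-task learning.
# The exponential decay and the constant k are assumptions for illustration;
# the abstract's point is only that the chance of drawing a syntax batch
# starts high and decays as training shifts focus to translation.

def p_syntax(step: int, total_steps: int, k: float = 5.0) -> float:
    """Probability of sampling a syntax (tagging/parsing) batch at `step`."""
    return math.exp(-k * step / total_steps)

def pick_task(step: int, total_steps: int, rng: random.Random) -> str:
    """Interleave tasks: mostly syntax early, mostly translation later."""
    return "syntax" if rng.random() < p_syntax(step, total_steps) else "translation"

if __name__ == "__main__":
    rng = random.Random(0)
    total = 100_000
    for step in (0, 10_000, 25_000, 50_000, 100_000):
        print(f"step {step:>7}: p(syntax) = {p_syntax(step, total):.3f}")
    # A real loop would fetch a batch for pick_task(step, total, rng) at each
    # step and update a shared encoder-decoder on that task's loss.
```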

Citation types: 1 supporting, 78 mentioning, 0 contrasting

Citing publications: 2019–2024

Cited by 72 publications (79 citation statements); references 31 publications.

Citation statements (ordered by relevance):

“…The papers mentioned so far targeted primarily the quality of MT (as measured by BLEU), not the secondary tasks. [13] note that their system performs reasonably well in both tagging and parsing. [31] present an in-depth analysis of the syntactic knowledge learned by the recurrent sequence-to-sequence NMT.…”
Section: Related Work | Citation type: mentioning | Confidence: 92%

“…In parallel to our work, [13] examined various scheduling strategies for a very simple approach to multi-tasking: representing all the tasks converted to a common format of source and target sequences of symbols from a joint vocabulary and training one sequence-to-sequence system on the mix of training examples from the different tasks.…”
Section: Related Work | Citation type: mentioning | Confidence: 99%

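The statement above describes recasting every task as source and target symbol sequences over one joint vocabulary, then training a single sequence-to-sequence model on the mixed examples. Below is a minimal sketch of that data-side setup; the task-marker tokens and the crude linearized-parse target are hypothetical choices for illustration, not necessarily how [13] encodes tasks.

```python
import random

# Sketch of the mixed-task seq2seq data format described above. The
# task-marker tokens and the linearized-parse target format are
# assumptions chosen for illustration.

def translation_example(src: str, tgt: str) -> tuple[list[str], list[str]]:
    return ["<translate>"] + src.split(), tgt.split()

def parsing_example(sentence: str, linearized_tree: str) -> tuple[list[str], list[str]]:
    return ["<parse>"] + sentence.split(), linearized_tree.split()

examples = [
    translation_example("ein Haus", "a house"),
    parsing_example("a house", "( NP ( DT a ) ( NN house ) )"),
]

# One joint vocabulary over every symbol on either side of every task.
vocab = sorted({tok for src, tgt in examples for tok in src + tgt})

# Training then just draws from the shuffled mix of all tasks' examples.
random.Random(0).shuffle(examples)
print(vocab)
```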