Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.415
Zero-Shot Crosslingual Sentence Simplification

Abstract: Sentence simplification aims to make sentences easier to read and understand. Recent approaches have shown promising results with encoder-decoder models trained on large amounts of parallel data, which often exist only in English. We propose a zero-shot modeling framework which transfers simplification knowledge from English to another language (for which no parallel simplification corpus exists) while generalizing across languages and tasks. A shared transformer encoder constructs language-agnostic representations…

Cited by 14 publications (11 citation statements). References 39 publications.
“…Due to those, it is not clear if the recently proposed neural ATS systems, published in top-tier NLP/CL/AI conferences, e.g. (Nisioi et al., 2017; Zhang and Lapata, 2017; Surya et al., 2019; Kumar et al., 2020; Mallinson et al., 2020), represent a real step forward toward using ATS for social good.…”
Section: Discussion (mentioning)
confidence: 99%
“…The state-of-the-art ATS systems published in top-tier NLP/CL/AI conferences, e.g. (Nisioi et al., 2017; Zhang and Lapata, 2017; Surya et al., 2019; Kumar et al., 2020; Mallinson et al., 2020), all describe end-to-end systems for sentence simplification, and are not directed towards any particular simplification transformation or target population.…”
Section: ATS Research Trends (mentioning)
confidence: 99%
“…Arguably, unaligned resources might still be helpful to facilitate pre-training of models. In an attempt to circumvent data scarcity, Mallinson et al. (2020) employ multi-lingual pre-training, which they tested with a small, manually labeled German evaluation set.…”
Section: Text Simplification (mentioning)
confidence: 99%
“…Some unsupervised or semi-supervised neural models, which reduce the need for parallel data, have reached similar performances (Surya et al., 2019; Kumar et al., 2020; Zhao et al., 2020; Martin et al., 2020b). Finally, experiments with multi-task learning have shown promising results (Guo et al., 2018; Dmitrieva and Tiedemann, 2021), with the possibility of zero-shot translations for languages without any parallel data (Mallinson et al., 2020). These approaches represent the current state of the art, but are largely limited to English (Al-Thanyyan and Azmi, 2021) due to a lack of training data in other languages.…”
Section: Related Work (mentioning)
confidence: 99%