Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1453

Simple and Effective Paraphrastic Similarity from Parallel Translations

Abstract: We present a model and methodology for learning paraphrastic sentence embeddings directly from bitext, removing the time-consuming intermediate step of creating paraphrase corpora. Further, we show that the resulting model can be applied to cross-lingual tasks where it both outperforms and is orders of magnitude faster than more complex state-of-the-art baselines.
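To make the abstract's approach concrete, here is a minimal sketch (not the authors' released code) of the kind of model and objective described: a sentence is embedded by averaging sub-word embeddings, and a margin-based loss pushes each bitext pair closer together than either side is to an in-batch negative. The class name, embedding dimension, and margin value are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AvgEncoder(nn.Module):
    """Embed a sentence as the average of its sub-word embeddings."""
    def __init__(self, vocab_size: int, dim: int = 300):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim, padding_idx=0)

    def forward(self, ids: torch.Tensor) -> torch.Tensor:
        # ids: (batch, seq_len), with 0 used as padding
        mask = (ids != 0).float().unsqueeze(-1)      # (batch, seq, 1)
        summed = (self.emb(ids) * mask).sum(dim=1)   # (batch, dim)
        counts = mask.sum(dim=1).clamp(min=1.0)
        return summed / counts                       # mean over real tokens

def margin_loss(src_vec: torch.Tensor, tgt_vec: torch.Tensor,
                margin: float = 0.4) -> torch.Tensor:
    """Hinge loss: each aligned translation pair should score higher
    (in cosine) than the hardest in-batch negative by at least `margin`."""
    src = F.normalize(src_vec, dim=-1)
    tgt = F.normalize(tgt_vec, dim=-1)
    sims = src @ tgt.t()          # (batch, batch) cosine similarities
    pos = sims.diag()             # aligned bitext pairs
    # Push the diagonal below any attainable cosine, then take the
    # hardest remaining (negative) entry in each row.
    neg = (sims - torch.eye(len(sims)) * 2.0).max(dim=1).values
    return F.relu(margin - pos + neg).mean()
```

In practice such models are trained over many batches of parallel sentences; the averaging encoder is what keeps inference orders of magnitude faster than deeper baselines.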

Cited by 35 publications (45 citation statements)
References 37 publications
“…Guo et al. (2018) design a dual-encoder model to learn multilingual sentence embeddings directly with added negative examples. Wieting et al. (2019) obtain sentence embeddings from sub-word embeddings and train a simpler model to distinguish positive and negative examples. Artetxe and Schwenk (2019) refine Guo et al. (2018)'s work and achieve state-of-the-art results by looking at the margins of cosine similarities between pairs of nearest neighbours.…”
Section: Related Work
confidence: 99%
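The margin-based scoring attributed above to Artetxe and Schwenk (2019) can be illustrated with a rough sketch: rather than ranking candidate pairs by raw cosine similarity, each pair's cosine is compared against the average cosine of each sentence's k nearest neighbours, which penalizes "hub" sentences that sit close to everything. The value of k and the ratio form of the margin are assumptions made here for illustration.

```python
import numpy as np

def margin_scores(src: np.ndarray, tgt: np.ndarray, k: int = 4) -> np.ndarray:
    """src: (m, d), tgt: (n, d) L2-normalized sentence embeddings.
    Returns an (m, n) matrix of ratio-margin scores."""
    cos = src @ tgt.T                                   # (m, n) cosine matrix
    # Average similarity of each source to its k nearest targets, and vice versa.
    nn_src = np.sort(cos, axis=1)[:, -k:].mean(axis=1)  # (m,)
    nn_tgt = np.sort(cos, axis=0)[-k:, :].mean(axis=0)  # (n,)
    denom = (nn_src[:, None] + nn_tgt[None, :]) / 2.0
    return cos / denom                                  # margin-adjusted scores

# Usage: for each source sentence, pick the target with the best margin score.
# best = margin_scores(src_emb, tgt_emb).argmax(axis=1)
```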
“…In other NLP tasks, topic information is used as conditional signals and applied to dialogue response generation (Xing et al., 2017) and pretraining of large-scale language models (Keskar et al., 2019), while sentiment polarity is used in text style transfer (John et al., 2019). In image style transfer, codes specifying color or texture are used to train conditional generative models (Mirza and Osindero, 2014; Higgins et al., 2017).…”
Section: In Relation To Other Work
confidence: 99%
“…BertScore has performed better than ROUGE and BLEU in sentence-level semantic similarity assessment (Zhang et al., 2020). Moreover, BertScore includes recall measures between reference and candidate sequences, a more suitable metric than distance-based similarity measures (Wieting et al., 2019; Reimers and Gurevych, 2019) for summarization-related tasks, where there is an asymmetrical relationship between the reference and the generated text.…”
Section: Similarity Metric: Semantic Affinity vs. Lexical Overlap
confidence: 99%
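The recall component of BertScore mentioned in this statement can be sketched as follows: each reference token embedding is greedily matched to its most similar candidate token, and recall is the average of those maxima, which is what makes the measure asymmetric between reference and candidate. A simplified sketch appears below; the real BertScore additionally applies IDF weighting and baseline rescaling, both omitted here.

```python
import numpy as np

def bertscore_recall(ref_emb: np.ndarray, cand_emb: np.ndarray) -> float:
    """ref_emb: (r, d), cand_emb: (c, d) contextual token embeddings."""
    # Normalize rows so dot products are cosine similarities.
    ref = ref_emb / np.linalg.norm(ref_emb, axis=1, keepdims=True)
    cand = cand_emb / np.linalg.norm(cand_emb, axis=1, keepdims=True)
    sims = ref @ cand.T                     # (r, c) token-level cosines
    return float(sims.max(axis=1).mean())   # best match per reference token
```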