Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing, 2009
DOI: 10.3115/1690219.1690263
Application-driven statistical paraphrase generation

Abstract: Paraphrase generation (PG) is important in many NLP applications. However, research on PG remains limited. In this paper, we propose a novel method for statistical paraphrase generation (SPG), which can (1) serve various applications with a uniform statistical model, and (2) naturally combine multiple resources to enhance PG performance. In our experiments, we use the proposed method to generate paraphrases for three different applications. The results show that the method can be easily t…

Cited by 76 publications (61 citation statements) · References 20 publications
“…Zhao et al. [2008] apply SMT decoding for paraphrasing, using several log-linearly weighted resources (phrase table, thesaurus, etc.), while Zhao et al. [2009] filter out paraphrase candidates and weight paraphrase features according to the desired NLP task: sentence compression, simplification, or similarity computation. Malakasiotis [2009] proposes paraphrase recognition using machine learning techniques to combine similarity measures.…”
Section: Paraphrasing Approaches
confidence: 99%
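The log-linear combination of resource scores described in this excerpt can be sketched as follows; the candidate phrases, feature values, and weights below are purely illustrative, not taken from any of the cited papers:

```python
import math

def log_linear_score(features, weights):
    # Log-linear model: score = sum_i w_i * log(f_i); features assumed > 0.
    return sum(w * math.log(f) for f, w in zip(features, weights))

# Hypothetical paraphrase candidates for one source phrase, with one
# feature score per resource (e.g. phrase table, thesaurus).
candidates = {
    "a large number of": [0.6, 0.4],
    "plenty of": [0.3, 0.7],
}
weights = [0.5, 0.5]  # illustrative resource weights

# Pick the candidate with the highest combined score.
best = max(candidates, key=lambda c: log_linear_score(candidates[c], weights))
print(best)
```

In a real SMT-style system the weights would be tuned per application rather than fixed by hand, which is exactly the application-specific weighting the excerpt attributes to Zhao et al. [2009].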
“…Zhao et al. [2009] extend this approach by using multiple phrase tables. Their rationale is that monolingual corpora are in short supply compared with bilingual text and as a result give rise to relatively sparse phrase tables.…”
Section: Related Work
confidence: 99%
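Combining several sparse phrase tables into one, as described above, can be sketched as follows; the table contents and the keep-the-best-score merge policy are illustrative assumptions, not the paper's exact procedure:

```python
# Merge several sparse phrase tables, keeping the best score per
# (source, target) pair, so pairs missing from one table can be
# supplied by another. Entries and policy are illustrative.
def merge_phrase_tables(tables):
    merged = {}
    for table in tables:
        for pair, score in table.items():
            merged[pair] = max(score, merged.get(pair, 0.0))
    return merged

table_a = {("buy", "purchase"): 0.7}
table_b = {("buy", "purchase"): 0.5, ("buy", "acquire"): 0.4}
merged = merge_phrase_tables([table_a, table_b])
print(merged)
```

The point of the merge is coverage: each individual table is sparse, but their union offers paraphrase options that no single table contains.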
“…Thus combining multiple resources into a single phrase table mitigates this problem. Although Quirk et al. [2004] aim at generating target sentences that preserve meaning and do not delete any information from the source, Zhao et al. [2009] show that a phrase-based model can generate compressed sentences by selecting only translations where the target phrases are shorter than the source ones. More recently, Ganitkevitch et al. [2011] generalize Quirk et al.'s model to syntactic paraphrases and discuss how such a model can be adapted to sentence compression by augmenting the feature set with compression target features and by appropriately optimizing the system's training objective, in a fashion similar to Zhao et al. Our own work builds on the model developed by Cohn and Lapata [2009] and formulates abstractive compression as a tree-to-tree rewriting task.…”
Section: Related Work
confidence: 99%
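The length-based selection the excerpt attributes to Zhao et al. [2009] (retaining only translations whose target phrase is shorter than the source) can be sketched as a phrase-table filter; all entries below are made up for illustration:

```python
# Sketch of compression via phrase-table filtering: keep only paraphrase
# pairs whose target has fewer words than the source, so decoding with
# the filtered table tends to shorten the sentence. Entries are illustrative.
phrase_table = [
    ("a large number of", "many", 0.8),
    ("in spite of the fact that", "although", 0.9),
    ("cat", "feline animal", 0.5),  # lengthens the text, so it is dropped
]

def compression_entries(table):
    return [(src, tgt, p) for src, tgt, p in table
            if len(tgt.split()) < len(src.split())]

kept = compression_entries(phrase_table)
print(kept)
```

Word count is a crude length proxy (characters or syllables would also work); the key idea is that constraining the candidate set, rather than changing the decoder, is enough to bias output toward shorter sentences.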
“…Their aim was to report duplicate bugs, and to do this they used sentence selection, and global context-based and co-occurrence-based scoring. Studies have also been done in paraphrase generation in NLG (Zhao et al. 2009; Chevelu et al. 2009).…”
Section: Paraphrasing With Corpora
confidence: 99%