Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics 2014
DOI: 10.3115/v1/e14-1028
Word Ordering with Phrase-Based Grammars

Abstract: We describe an approach to word ordering using modelling techniques from statistical machine translation. The system incorporates a phrase-based model of string generation that aims to take unordered bags of words and produce fluent, grammatical sentences. We describe the generation grammars and introduce parsing procedures that address the computational complexity of generation under permutation of phrases. Against the best previous results reported on this task, obtained using syntax driven models, we report…
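The core idea the abstract describes, linearizing an unordered bag of words by searching over permutations and scoring candidates with a language model, can be illustrated with a deliberately tiny sketch. This is not the paper's phrase-based decoder: the bigram scores below are invented for illustration, and the search is a brute-force enumeration rather than the pruned parsing procedures the paper introduces to tame the factorial search space.

```python
from itertools import permutations

# Toy bigram "language model": log-probability-like scores for word pairs.
# These numbers are invented for illustration, not taken from any trained model.
BIGRAM_SCORE = {
    ("<s>", "the"): 0.0,
    ("the", "cat"): 0.0,
    ("cat", "sat"): 0.0,
    ("sat", "</s>"): 0.0,
}
UNSEEN_PENALTY = -5.0  # score assigned to any bigram not in the table

def score(sequence):
    """Sum bigram scores over the sentence, padded with boundary markers."""
    padded = ["<s>"] + list(sequence) + ["</s>"]
    return sum(BIGRAM_SCORE.get(pair, UNSEEN_PENALTY)
               for pair in zip(padded, padded[1:]))

def linearize(bag):
    """Exhaustively score every ordering of the bag and return the best one.

    This is O(n!) in the bag size; real word-ordering systems prune this
    space (e.g. with beam search or chart-based decoding) instead.
    """
    return max(permutations(bag), key=score)

print(linearize({"sat", "cat", "the"}))  # -> ('the', 'cat', 'sat')
```

The exponential cost of the exhaustive loop is exactly the "computational complexity of generation under permutation of phrases" that the abstract says the proposed parsing procedures address.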

Cited by 14 publications (12 citation statements). References 22 publications.
“…Zhang et al (2012) improve the CCG approach of Zhang and Clark (2011) by incorporating an N-gram language model. de Gispert et al (2014) present an N-gram language model approach similar to ours, with a different decoder that does not guarantee optimal results. In their comparison with the approach of Zhang et al (2012) they report gains of more than 20 BLEU points.…”
Section: Related Work
confidence: 99%
“…Numbers in bold mark the best result for a given model. We compare against the LM-based method of de Gispert et al (2014) and the n-gram and RNNLM (LSTM) models of Schmaltz et al (2016), of which the latter achieves the best BLEU score of 42.7. We can reproduce or surpass prior work for n-gram and RNNLM and show that g(·) outperforms f (·) for these models.…”
Section: Word Ordering on the Penn Treebank
confidence: 99%
“…It has also been addressed as LM-based linearization, which relies solely on language models and obtains better scores (de Gispert et al, 2014; Schmaltz et al, 2016). Recently, Schmaltz et al (2016) showed that recurrent neural network language models (Mikolov et al, 2010, RNNLMs) with long short-term memory (Hochreiter and Schmidhuber, 1997, LSTM) cells are very effective for word ordering even without any explicit syntactic information.…”
Section: Introduction
confidence: 99%
“…, Zhang (2013) and Song et al (2014) further extended this line of research by adding input syntax and allowing joint inflection and ordering. de Gispert et al (2014) use a phrase-structure grammar for word ordering. Our generation system is based on the work of Zhang (2013), but further allows lexical selection.…”
Section: Related Work
confidence: 99%