A swarm-inspired re-ranker system for statistical machine translation (2015)
DOI: 10.1016/j.csl.2014.07.002

Cited by 8 publications (11 citation statements). References 21 publications.
“…Various global features were investigated for SMT re-ranking, such as the decoder's scores, source and target sentences, alignments and POS tags, sentence type probabilities, posterior probabilities and back translation features. More recently, Farzi and Faili (2015) proposed a re-ranking system based on swarm algorithms.…”
Section: Related Work
confidence: 99%
“…Experiments with English-to-Portuguese translation showed a significant improvement that varied between 1.5 and 2.5 absolute BLEU points. Farzi and Faili (2015) used a set of non-syntactic features to re-rank the n-best translation candidates generated by a phrase-based Statistical Machine Translation system. They investigated several feature-weight optimization algorithms, such as Particle Swarm Optimization (PSO), Quantum-behaved Particle Swarm Optimization (QPSO), Genetic Algorithms (GA), Perceptron and Averaged Perceptron.…”
Section: Re-ranking by Including Additional Features
confidence: 99%
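The re-ranking step these statements describe is, at its core, a linear model: each n-best candidate receives a score equal to the dot product of its feature vector with a weight vector, and the list is re-sorted by that score. The sketch below illustrates the idea only; the feature names, values, and weights are illustrative placeholders, not the actual features or tuned weights of Farzi and Faili (2015).

```python
# Minimal linear re-ranker for an n-best list: score each candidate as
# the dot product of its feature vector with a weight vector and return
# the highest-scoring one. Features and weights below are placeholders.

def rerank(nbest, weights):
    """nbest: list of (translation, feature_vector) pairs;
    weights: one float per feature."""
    def score(features):
        return sum(w * f for w, f in zip(weights, features))
    return max(nbest, key=lambda cand: score(cand[1]))

# Toy usage: three candidates with (LM score, TM score, length penalty).
nbest = [
    ("translation A", [-4.2, -7.1, -0.3]),
    ("translation B", [-3.8, -7.9, -0.1]),
    ("translation C", [-4.0, -6.5, -0.6]),
]
weights = [0.5, 0.4, 0.1]  # would be tuned by PSO/QPSO/GA/perceptron
best, _ = rerank(nbest, weights)
print(best)
```

Under this formulation, the optimizers the statement lists (PSO, QPSO, GA, perceptron variants) differ only in how they search for the weight vector, not in how candidates are scored.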
“…The feature set that we have proposed covers the lexical, syntactic and semantic aspects of the translation candidates (the n-best list candidates) and can be grouped into five classes: (1) Translation-based Features: these features are related to a translation model and can be helpful for promoting translation adequacy; (2) Fluency Features: these features are used to promote syntactic fluency by incorporating language models; (3) Length-based Features: these features are used to promote the n-best list candidates according to the likelihood of their lengths; (4) N-best list Features: these features are extracted directly from the n-best list and are used to promote the most likely candidates in it; and (5) Embedding Features: these are based on bilingual word embeddings and are used to cover the semantic aspect of the translation candidates. For the problem of feature-weight optimization, we present a methodology that is fairly similar to the one presented by Farzi and Faili (2015). It uses Quantum-behaved Particle Swarm Optimization (QPSO), which guarantees the global convergence of the optimization process.…”
Section: Introduction
confidence: 99%
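To make the optimizer named in this statement concrete, the following is a minimal sketch of a standard QPSO update (local attractor, mean-best point, and a decaying contraction-expansion coefficient). It is not the exact configuration of the cited works: the toy objective stands in for a dev-set metric such as BLEU evaluated with candidate feature weights, and all parameter values here are assumptions.

```python
import numpy as np

def qpso(objective, dim, n_particles=20, iters=100, seed=0):
    """Quantum-behaved PSO sketch: maximizes `objective` over R^dim.
    In the re-ranking setting, `objective(w)` would score feature
    weights w by a dev-set metric such as BLEU (hypothetical)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_particles, dim))  # particle positions
    pbest = x.copy()                                 # personal bests
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmax()].copy()         # global best
    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters                 # contraction-expansion: 1.0 -> 0.5
        mbest = pbest.mean(axis=0)                   # mean of personal bests
        for i in range(n_particles):
            phi = rng.random(dim)
            p = phi * pbest[i] + (1 - phi) * gbest   # local attractor
            u = 1.0 - rng.random(dim)                # in (0, 1], keeps log finite
            sign = np.where(rng.random(dim) < 0.5, -1.0, 1.0)
            x[i] = p + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / u)
            val = objective(x[i])
            if val > pbest_val[i]:
                pbest_val[i], pbest[i] = val, x[i].copy()
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest

# Toy objective (stand-in for dev-set BLEU): peak at w = (0.5, 0.4, 0.1).
target = np.array([0.5, 0.4, 0.1])
w = qpso(lambda v: -np.sum((v - target) ** 2), dim=3)
print(np.round(w, 2))
```

Unlike classical PSO, the particles carry no velocity: each new position is sampled around the local attractor with a spread tied to the distance from the mean-best point, which is the property behind the global-convergence guarantee the statement mentions.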