2019
DOI: 10.1007/978-981-15-1721-1_5

Neural Machine Translation with Attention Based on a New Syntactic Branch Distance

Cited by 5 publications (2 citation statements)
References 15 publications
“…The disadvantage of this model is that, compared with our method, the prior knowledge will be diluted when faced with long sentences. Peng et al. [26] established a syntactic dependency matrix for each word based on the syntactic dependency tree and integrated quantitative syntactic knowledge into the translation model to guide translation, effectively learning syntactic details and eliminating the dispersion of attention scores.…”
Section: Related Work
confidence: 99%
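
The quoted description of Peng et al.'s method is high-level, and their exact formulation is not reproduced here. Purely as an illustration, the sketch below shows one common way such quantitative syntactic knowledge can enter an attention layer: a pairwise dependency-distance matrix is scaled by a decay weight and subtracted from the raw attention logits before the softmax, so attention concentrates on syntactically close words. The function name and the lambda_decay parameter are hypothetical; this is a minimal sketch under those assumptions, not the paper's actual model.

import numpy as np

def biased_attention(scores, dep_dist, lambda_decay=0.5):
    # scores:   (n, n) raw attention logits over source words
    # dep_dist: (n, n) pairwise path lengths on the dependency tree
    # Words far apart on the tree receive a more negative bias,
    # pulling attention toward syntactically close words.
    # lambda_decay and the linear decay form are assumptions.
    logits = scores - lambda_decay * dep_dist
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)  # row-wise softmax
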
“…Dependency distance: in line with previous work [5,17,18,25], the dependency tree is extracted from the sentence by an external syntax parser and used to derive the word-level dependency distance. The dependency distance is defined as the length of the path traversed from one word to another on the tree; two directly connected words are assigned a distance of 1.…”
Section: Approach
confidence: 99%
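
The definition quoted above — path length between two words on the dependency tree, with directly connected words at distance 1 — can be computed by breadth-first search over the tree's undirected edges. The head-array input format below (heads[i] is the index of token i's head, -1 for the root) is an assumed parser output convention; parsers such as spaCy or Stanza expose equivalent head information.

from collections import deque

def dependency_distances(heads):
    # heads: list where heads[i] is the index of token i's head
    #        in the dependency tree (-1 for the root).
    # Returns an n x n matrix of path lengths on the tree;
    # directly connected words get distance 1.
    n = len(heads)
    # Build an undirected adjacency list from the head links.
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].append(h)
            adj[h].append(i)
    dist = [[0] * n for _ in range(n)]
    for src in range(n):
        seen = {src}
        q = deque([(src, 0)])
        while q:
            node, d = q.popleft()
            dist[src][node] = d
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    q.append((nb, d + 1))
    return dist

# Example (hypothetical sentence "She eats fish", root "eats"):
print(dependency_distances([1, -1, 1]))
# [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
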