2021
DOI: 10.48550/arxiv.2111.11707
Preprint

Deps-SAN: Neural Machine Translation with Dependency-Scaled Self-Attention Network

Abstract: Neural machine translation models assume that syntactic knowledge can be learned automatically from the bilingual corpus via the attention network. In practice, however, an attention network trained under such weak supervision cannot capture the deep structure of a sentence. It is therefore natural to introduce external syntactic knowledge to guide the learning of the attention network. To this end, we propose a novel, parameter-free, dependency-scaled self-attention network, which integrates explicit syntactic dependencies into the…
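The truncated abstract only names the mechanism, but the core idea, rescaling self-attention by syntactic proximity without adding parameters, can be sketched. Below is a minimal NumPy sketch assuming the dependency signal is a Gaussian of pairwise dependency-tree distance; the function name, the `sigma` width, and the `dep_dist` input format are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def deps_scaled_attention(Q, K, V, dep_dist, sigma=1.0):
    """Hypothetical dependency-scaled self-attention.

    Q, K, V  : (seq_len, d_k) query/key/value matrices.
    dep_dist : (seq_len, seq_len) pairwise token distances in the
               source dependency tree (an assumed input format).
    sigma    : width of the assumed Gaussian dependency scaling.
    """
    d_k = Q.shape[-1]
    # Standard scaled dot-product attention probabilities.
    scores = Q @ K.T / np.sqrt(d_k)
    probs = np.exp(scores - scores.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)
    # Parameter-free rescaling: concentrate attention mass on
    # syntactically close token pairs, then renormalise.
    dep_scale = np.exp(-dep_dist.astype(float) ** 2 / (2.0 * sigma ** 2))
    probs *= dep_scale
    probs /= probs.sum(axis=-1, keepdims=True)
    return probs @ V
```

For a three-token sentence whose parse is a chain, `dep_dist` would be `[[0,1,2],[1,0,1],[2,1,0]]`; with `sigma=1.0`, tokens two hops apart have their attention weight multiplied by exp(-2) before renormalisation.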

Cited by 1 publication (1 citation statement)
References 21 publications

“…Previous research on the Transformer [29,32,46] found that lower layers are inclined to focus on structural information among words, while higher layers (usually above the 4th layer) prefer to capture sentence-level semantics. Based on this insight, we fuse visual features, which are semantically different from the text, at the higher layers, which moderates the corruption of the sentences' original semantic modelling.…”
Section: Effect of Fusion Position and RNN Type
Citation type: mentioning (confidence: 96%)
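The citing work's design choice, injecting visual features only above the assumed 4th-layer boundary so that the structure-oriented lower layers stay purely textual, can be illustrated with a toy PyTorch sketch. The `LateFusionEncoder` class, the gated-residual fusion, and the `fuse_from` threshold are illustrative assumptions, not the cited paper's actual architecture:

```python
import torch
import torch.nn as nn

class LateFusionEncoder(nn.Module):
    """Hypothetical encoder that injects a visual feature only in the
    higher Transformer layers (above an assumed 4th-layer boundary)."""

    def __init__(self, d_model=512, n_layers=6, fuse_from=4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
            for _ in range(n_layers)
        )
        self.fuse_from = fuse_from  # first layer index that sees the image
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, text, visual):
        # text:   (batch, seq_len, d_model) token states
        # visual: (batch, d_model) pooled image feature
        h = text
        for i, layer in enumerate(self.layers):
            if i >= self.fuse_from:
                # Gated residual fusion in the semantic (higher) layers
                # only; lower, structure-oriented layers remain textual.
                v = visual.unsqueeze(1).expand_as(h)
                h = h + torch.tanh(self.gate(torch.cat([h, v], dim=-1)))
            h = layer(h)
        return h
```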