2023
DOI: 10.1109/taslp.2022.3221040
GTrans: Grouping and Fusing Transformer Layers for Neural Machine Translation

Cited by 12 publications (1 citation statement)
References 42 publications
“…Building upon that, proposed Viterbi decoding and further explored fuzzy alignment training, achieving the current state-of-the-art NAR model performance. Apart from text translation, the NAR model has also demonstrated impressive performance in diverse areas such as speech-to-text translation (Xu et al., 2023), speech-to-speech translation (Fang et al., 2023), and text-to-speech synthesis (Ren et al., 2021).…”
Section: Related Work
confidence: 99%