2022
DOI: 10.48550/arxiv.2209.07634
Preprint

Stateful Memory-Augmented Transformers for Dialogue Modeling

Abstract: Transformer encoder-decoder models have shown impressive performance in dialogue modeling. However, as Transformers are inefficient in processing long sequences, dialogue history length often needs to be truncated. To address this problem, we propose a new memory-augmented Transformer that is compatible with existing pre-trained encoder-decoder models and enables efficient preservation of history information. It incorporates a separate memory module alongside the pre-trained Transformer to effectively interchang…

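The idea described in the abstract, a separate memory module kept alongside a pre-trained encoder-decoder so that dialogue history can persist beyond the truncated input window, can be sketched roughly as follows. This is a minimal illustration only, assuming a fixed set of learned memory slots updated once per turn via cross-attention; the names (MemoryModule, write, read) and the update rule are hypothetical and are not taken from the paper.

# Hypothetical sketch: a fixed-size learned memory is updated from each turn's
# encoder states and later read by the decoder, so history outside the current
# (truncated) input can still influence generation. Illustrative only.
import torch
import torch.nn as nn

class MemoryModule(nn.Module):
    def __init__(self, d_model: int = 256, n_slots: int = 16, n_heads: int = 4):
        super().__init__()
        # Learned initial memory slots, shape (1, n_slots, d_model), expanded per batch.
        self.init_memory = nn.Parameter(torch.randn(1, n_slots, d_model) * 0.02)
        self.write_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.read_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mem_norm = nn.LayerNorm(d_model)
        self.out_norm = nn.LayerNorm(d_model)

    def initial_state(self, batch_size: int) -> torch.Tensor:
        return self.init_memory.expand(batch_size, -1, -1).contiguous()

    def write(self, memory: torch.Tensor, turn_states: torch.Tensor) -> torch.Tensor:
        # Memory slots attend to the current turn's encoder states and are
        # updated residually, giving a stateful summary of the dialogue so far.
        update, _ = self.write_attn(memory, turn_states, turn_states)
        return self.mem_norm(memory + update)

    def read(self, decoder_states: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # Decoder states attend over the memory to recover history information.
        context, _ = self.read_attn(decoder_states, memory, memory)
        return self.out_norm(decoder_states + context)


if __name__ == "__main__":
    mem = MemoryModule()
    memory = mem.initial_state(batch_size=2)
    for _ in range(3):                             # three dialogue turns
        turn_states = torch.randn(2, 20, 256)      # stand-in for encoder outputs
        memory = mem.write(memory, turn_states)    # update persistent state
    decoder_states = torch.randn(2, 10, 256)
    out = mem.read(decoder_states, memory)
    print(out.shape)                               # torch.Size([2, 10, 256])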
Cited by 0 publications
References 8 publications