2022
DOI: 10.48550/arxiv.2209.09146
Preprint

Semantic-based Pre-training for Dialogue Understanding

Abstract: Pre-trained language models have made great progress on dialogue tasks. However, these models are typically trained on surface dialogue text and have thus been shown to be weak at understanding the main semantic meaning of a dialogue context. We investigate Abstract Meaning Representation (AMR) as explicit semantic knowledge that helps pre-trained models capture the core semantic information in dialogues. In particular, we propose a semantic-based pre-training framework that extends the standard pre-training…
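As a rough illustration of the kind of explicit semantic signal the abstract describes, the sketch below parses dialogue turns into AMR graphs. This is a minimal sketch, not the authors' pipeline; it assumes the third-party amrlib parser with a downloaded sentence-to-graph model.

    # Minimal sketch: extracting AMR graphs from dialogue turns as explicit
    # semantic knowledge. Assumes `pip install amrlib` plus a downloaded
    # sentence-to-graph (stog) model; this is NOT the paper's own code.
    import amrlib

    # Load a pretrained text-to-AMR parser from the local model directory.
    stog = amrlib.load_stog_model()

    dialogue_turns = [
        "I booked a table for two at seven.",
        "Great, should I order a cab for you as well?",
    ]

    # parse_sents returns one PENMAN-notation AMR string per input sentence.
    graphs = stog.parse_sents(dialogue_turns)
    for turn, graph in zip(dialogue_turns, graphs):
        print(turn)
        print(graph)  # e.g. (b / book-01 :ARG0 (i / i) :ARG1 (t / table) ...)

Graphs like these could then be linearized and paired with the surface dialogue text as additional pre-training input, which is the general idea the framework builds on.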

Cited by 0 publications
References 38 publications