2022
DOI: 10.48550/arxiv.2204.04289
Preprint

Towards Understanding Large-Scale Discourse Structures in Pre-Trained and Fine-Tuned Language Models

Abstract: With a growing number of BERTology works analyzing different components of pre-trained language models, we extend this line of research through an in-depth analysis of discourse information in pre-trained and fine-tuned language models. We move beyond prior work along three dimensions: First, we describe a novel approach to infer discourse structures from arbitrarily long documents. Second, we propose a new type of analysis to explore where and how accurately intrinsic discourse is captured in the BERT and BART …

Cited by 0 publications
References 41 publications