Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1383

Pretrained Language Models for Sequential Sentence Classification

Abstract: As a step toward better document-level understanding, we explore classification of a sequence of sentences into their corresponding categories, a task that requires understanding sentences in the context of the document. Recent successful models for this task have used hierarchical models to contextualize sentence representations, and Conditional Random Fields (CRFs) to incorporate dependencies between subsequent labels. In this work, we show that pretrained language models, BERT (Devlin et al., 2018) in particular…
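The mechanism the citing papers below describe (a single joint input sequence, with the contextual embedding of each sentence's [SEP] token classified by a linear layer) can be sketched roughly as follows. This is a minimal illustration assuming the HuggingFace transformers API; the class name, label count, and example sentences are hypothetical, not the authors' released code.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

class SequentialSentenceClassifier(nn.Module):
    """Classify each sentence via the contextual embedding of its [SEP] token."""

    def __init__(self, num_labels: int):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.head = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # One [SEP] per sentence; its hidden state summarizes that sentence
        # in the context of the whole input sequence, so no CRF is needed.
        sep_mask = input_ids == tokenizer.sep_token_id
        sep_states = hidden[sep_mask]      # (num_sentences, hidden_size)
        return self.head(sep_states)       # (num_sentences, num_labels)

sentences = ["We study sequential sentence classification.",
             "Our model uses BERT.",
             "It needs no CRF."]
# Joining with "[SEP]" yields one separator per sentence (the tokenizer
# appends the final [SEP] itself).
enc = tokenizer(" [SEP] ".join(sentences), return_tensors="pt")
model = SequentialSentenceClassifier(num_labels=5)
logits = model(enc["input_ids"], enc["attention_mask"])  # one row per sentence
```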

Cited by 90 publications (95 citation statements)
References 17 publications
“…Of these, only one specifically performs sequential sentence classification (i.e., the task of providing labels for each of the sentences in the multi-sentence input) (Cohan et al., 2019). There exist non-PLM approaches to sequential sentence classification as well.…”
Section: Related Work (mentioning, confidence: 99%)
“…To involve direct textual context, we use a Windowed Sequential Sentence Classification method (WinSSC). Like Cohan et al. (2019)'s method of using pre-trained language models for sequential sentence classification, WinSSC takes multiple sentences as its input sequence, generates embeddings for the separator tokens in the sequence, and classifies these embeddings with a linear layer that outputs as many labels as there are sentences in the input sequence. Prior to embedding, sequences are book-ended with the last sentence from the previous sequence and the first sentence of the next sequence.…”
Section: Approaches (mentioning, confidence: 99%)
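The book-ending step in the quote above can be sketched as follows; the function name, window_size, and the fixed-size chunking are assumptions for illustration, not details taken from the cited paper.

```python
from typing import List

def windowed_sequences(sentences: List[str],
                       window_size: int = 8) -> List[List[str]]:
    """Split a document into sentence windows, each book-ended with the
    last sentence of the previous window and the first of the next."""
    windows = [sentences[i:i + window_size]
               for i in range(0, len(sentences), window_size)]
    bookended = []
    for i, win in enumerate(windows):
        prev_ctx = [windows[i - 1][-1]] if i > 0 else []
        next_ctx = [windows[i + 1][0]] if i + 1 < len(windows) else []
        # The context sentences are embedded along with the window; only
        # the window's own sentences would receive predicted labels.
        bookended.append(prev_ctx + win + next_ctx)
    return bookended
```

Each book-ended window would then presumably be fed through a separator-token classifier like the one sketched earlier, with the predictions for the two context sentences discarded.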
“…on many NLP tasks, including machine translation (Vaswani et al., 2017; Dehghani et al., 2019), sentence classification (Devlin et al., 2019; Cohan et al., 2019), and text summarization (Song et al., 2019; …).…”
Section: Introduction (mentioning, confidence: 99%)
“…Sun et al. (2019a) proposed constructing an auxiliary sentence to solve aspect-based sentiment classification tasks. Cohan et al. (2019) added extra separator tokens to obtain representations of each sentence to solve sequential sentence classification tasks. Sun et al. (2019b) summarized several fine-tuning methods, including fusing text representations from different layers, utilizing multi-task learning, etc.…”
Section: Introduction (mentioning, confidence: 99%)
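One fine-tuning variant mentioned in the last quote, fusing text representations from different layers, is commonly realized as a learned weighted sum over the encoder's layer outputs (a "scalar mix"). The sketch below illustrates that general idea under those assumptions; it is not the specific method of Sun et al. (2019b).

```python
import torch
import torch.nn as nn

class ScalarMix(nn.Module):
    """Learned softmax-weighted sum of per-layer hidden states."""

    def __init__(self, num_layers: int):
        super().__init__()
        self.weights = nn.Parameter(torch.zeros(num_layers))

    def forward(self, layer_states):
        # layer_states: list of (batch, seq_len, hidden) tensors, one per layer.
        w = torch.softmax(self.weights, dim=0)
        return sum(wi * h for wi, h in zip(w, layer_states))

# Usage with HF transformers: call the model with output_hidden_states=True
# and pass outputs.hidden_states[1:] (the encoder layers) to ScalarMix,
# then feed the fused representation to a task-specific head.
```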