Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d18-1368
Building Context-aware Clause Representations for Situation Entity Type Classification

Abstract: Capabilities to categorize a clause based on the type of situation entity (e.g., events, states, and generic statements) it introduces to the discourse can benefit many NLP applications. Observing that the situation entity type of a clause depends on the discourse functions the clause plays in a paragraph, and that the interpretation of those discourse functions depends heavily on paragraph-wide contexts, we propose to build context-aware clause representations for predicting the situation entity types of clauses. Specific…

Cited by 5 publications (8 citation statements)
References 25 publications
“…Xu et al. (2015) propose a shortest dependency path LSTM for sentence representation in the task of relation classification. Dai and Huang (2018) propose a BiLSTM-based model that combines paragraph vectors and word vectors into clause embeddings for a situation entity classification task. Connective prediction has often been addressed: Ji and Eisenstein (2015) and Rutherford et al. (2017) use recursive neural networks with parse trees as input to predict connectives and discourse relations, with solid improvements on PDTB.…”
Section: Related Work
confidence: 99%
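The citation statement above describes combining paragraph vectors and word vectors into clause embeddings. The following is only an illustrative sketch of that idea, not the authors' model: the dimensions, the use of mean pooling for both the clause and paragraph vectors, and all variable names are assumptions (Dai and Huang (2018) learn these representations with BiLSTMs rather than averaging).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy paragraph: 3 clauses with 4, 7, and 5 words,
# each word represented by a 50-dimensional vector.
EMB_DIM = 50
clauses = [rng.normal(size=(n_words, EMB_DIM)) for n_words in (4, 7, 5)]

# Stand-in "paragraph vector": the mean of all word vectors in the
# paragraph, providing paragraph-wide context to every clause.
paragraph_vec = np.concatenate(clauses).mean(axis=0)

# Clause embedding: mean of the clause's own word vectors concatenated
# with the shared paragraph vector, so each clause representation is
# context-aware in the loose sense sketched here.
clause_embs = [np.concatenate([c.mean(axis=0), paragraph_vec]) for c in clauses]

print(len(clause_embs), clause_embs[0].shape)
```

A classifier over situation entity types would then take these per-clause vectors as input; in the actual model the pooling steps are replaced by learned recurrent encoders.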
“…Several studies on entity typing have been conducted in the past [9–25]. In general, these studies can be divided into unsupervised and semi-supervised groups depending on the approach used.…”
Section: Related Work
confidence: 99%
“…By contrast, the semi-supervised approach focuses on learning the representation of categories or entities. Dai et al. [24] proposed context-aware clause representations to predict the situation entity types of clauses. In recent years, pretrained language models have developed rapidly.…”
Section: Related Work
confidence: 99%