Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.29
Heterogeneous Graph Neural Networks to Predict What Happen Next

Abstract: Given an incomplete event chain, script learning aims to predict the missing event, which can support a series of NLP applications. Existing work cannot adequately represent the heterogeneous relations or capture the discontinuous event segments that are common in event chains. To address these issues, we introduce a heterogeneous-event (HeterEvent) graph network. In particular, we employ each unique word and individual event as nodes in the graph, and explore three kinds of edges based on realistic relations (e…
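The abstract describes a graph whose nodes are unique words and individual events, connected by typed edges (the exact three edge types are truncated above). A minimal sketch of how such a heterogeneous graph might be assembled, using illustrative edge types ("word-in-event" and "event-order") that are assumptions, not the paper's definitions:

```python
# Illustrative sketch (not the paper's implementation): build a
# heterogeneous graph whose nodes are unique words and individual
# events, with typed edges between them.
from collections import defaultdict


def build_heterevent_graph(event_chain):
    """event_chain: list of events, each a tuple of word tokens.

    Returns (nodes, edges), where edges maps a (src, dst) node pair
    to the set of relation types connecting them.
    """
    edges = defaultdict(set)
    words = sorted({w for ev in event_chain for w in ev})
    for i, ev in enumerate(event_chain):
        e_node = ("event", i)
        for w in ev:
            # hypothetical relation: a word occurs in an event
            edges[(("word", w), e_node)].add("word-in-event")
        if i + 1 < len(event_chain):
            # hypothetical relation: temporal order of adjacent events
            edges[(e_node, ("event", i + 1))].add("event-order")
    nodes = [("word", w) for w in words] + \
            [("event", i) for i in range(len(event_chain))]
    return nodes, dict(edges)


chain = [("walk", "dog"), ("dog", "bark"), ("give", "treat")]
nodes, edges = build_heterevent_graph(chain)
```

Because the word node for "dog" links to both event 0 and event 1, a message-passing layer over this graph can relate events that share arguments even when they are not adjacent in the chain.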

Cited by 24 publications (17 citation statements). References 26 publications.
“…The UniFA-S (Zheng et al., 2020a) adopted a Variational AutoEncoder architecture (Kingma and Welling, 2014) with a unified fine-tuning method to learn event representations at the intra-event, inter-event, and scenario levels. HeterEvent (Zheng et al., 2020b) proposed a heterogeneous graph neural network that models discontinuous event segments explicitly. In the line of research on knowledge-enriched representations, researchers have attempted to incorporate several kinds of external knowledge into event representations.…”

Section: Narrative Event Representation

confidence: 99%
“…Besides, Lv et al. (2019) employ a self-attention mechanism (Lin et al., 2017) to implicitly represent the event chain across its diverse event segments. Zheng et al. (2020b) adopt the graph attention network (Velickovic et al., 2018) to aggregate information from neighboring events. We employ multi-head attention (Vaswani et al., 2017) to extract circumstance representations from the event sentence and adaptively aggregate circumstances at the global level.…”

Section: Attention Mechanism in Narrative Event Prediction

confidence: 99%
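The common thread in the statement above is attention-weighted aggregation of neighboring event representations. A minimal sketch of that core idea, assuming scaled dot-product attention (a simplification of the cited graph-attention and multi-head architectures, not their exact formulation):

```python
# Minimal sketch: aggregate neighbor event vectors into a single
# representation with softmax attention weights, the core mechanism
# behind attention-based neighborhood aggregation.
import numpy as np


def attention_aggregate(query, neighbors):
    """query: (d,) target-event vector; neighbors: (n, d) neighbor vectors.

    Returns the attention-weighted sum of the neighbor vectors.
    """
    scores = neighbors @ query / np.sqrt(len(query))  # scaled dot-product
    weights = np.exp(scores - scores.max())           # stable softmax
    weights /= weights.sum()
    return weights @ neighbors


q = np.array([1.0, 0.0])                  # target event
nbrs = np.array([[1.0, 0.0],              # similar neighbor
                 [0.0, 1.0]])             # dissimilar neighbor
out = attention_aggregate(q, nbrs)
```

The neighbor most similar to the query receives the larger weight, so the aggregated vector leans toward it; a multi-head variant would run several such aggregations in parallel subspaces and concatenate the results.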