Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.51

Joint Constrained Learning for Event-Event Relation Extraction

Abstract: Understanding natural language involves recognizing how multiple event mentions structurally and temporally interact with each other. In this process, one can induce event complexes that organize multi-granular events with temporal order and membership relations interweaving among them. Due to the lack of jointly labeled data for these relational phenomena and the restriction on the structures they articulate, we propose a joint constrained learning framework for modeling event-event relations. Specifically, th…
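To make the abstract's notion of an "event complex" concrete, here is a minimal Python sketch of events linked by temporal-order and membership (subevent) relations. The class, field names, and label sets are illustrative assumptions for this sketch, not the paper's actual data structures or label inventory.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

# Illustrative label sets (assumed for this sketch, not taken from the paper).
TEMPORAL = {"BEFORE", "AFTER", "EQUAL", "VAGUE"}
MEMBERSHIP = {"PARENT-CHILD", "CHILD-PARENT", "COREF", "NOREL"}

@dataclass
class EventComplex:
    """Multi-granular events with interweaving temporal and membership relations."""
    events: List[str] = field(default_factory=list)
    temporal: Dict[Tuple[str, str], str] = field(default_factory=dict)
    membership: Dict[Tuple[str, str], str] = field(default_factory=dict)

    def add_relation(self, e1: str, e2: str, label: str) -> None:
        for e in (e1, e2):
            if e not in self.events:
                self.events.append(e)
        if label in TEMPORAL:
            self.temporal[(e1, e2)] = label
        elif label in MEMBERSHIP:
            self.membership[(e1, e2)] = label
        else:
            raise ValueError(f"unknown relation label: {label}")

# Toy complex: "shooting" is a subevent of "attack" and precedes "evacuation".
ec = EventComplex()
ec.add_relation("shooting", "attack", "CHILD-PARENT")
ec.add_relation("shooting", "evacuation", "BEFORE")
```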


Citation types: 1 supporting, 64 mentioning, 0 contrasting

Cited by 74 publications (65 citation statements)
References 27 publications
“…Table 3 shows the performance of our model and the baselines. We see that our model is consistently better than BaseLM and, at the same time, comparable to Wang et al. (2020). Our model benefits more from input contexts, and only drops 4% in the OT-MS setting with minimal supervision (from 89.6 to 86.1), compared to the 10% drop from T5-Large.…”
Section: Extrinsic Evaluation (citation type: mentioning)
confidence: 57%
“…We report four results: OT-NS (original test, no story): train and test with only the sentences containing the trigger verbs; OT: train and test with the entire document (down-sampled to stay below the maximum sequence length) as an auxiliary input; OT-MS (original test, minimal supervision): train with 1.2k (6%) training instances; PT (perturbed test): train with the complete training set and test on a perturbed test set. In OT-NS, we also report a SOTA system from Wang et al. (2020) under the same two-label setting.…”
Section: Extrinsic Evaluation (citation type: mentioning)
confidence: 99%
“…Constrained learning in neural models. Another line of related work concerns the use of constraints in neural models (Li et al., 2019; Wang et al., 2020), where constraints are represented as first-order logic formulas and compiled into the loss functions. These models are typically trained to minimize a weighted sum of task losses and constraint losses.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
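As a rough illustration of how such logic formulas can be compiled into loss terms, here is a hedged PyTorch sketch. The product t-norm relaxation, the specific symmetry and transitivity rules, and the weighting scheme are assumptions for this example, not necessarily the exact formulation of Li et al. (2019) or Wang et al. (2020).

```python
import torch

# Sketch: relax a logic rule A -> B with the product t-norm, so its violation
# becomes max(0, log P(A) - log P(B)); the loss is zero when the rule holds.

def symmetry_loss(p_before_xy: torch.Tensor, p_after_yx: torch.Tensor) -> torch.Tensor:
    # Rule (assumed): BEFORE(x, y) -> AFTER(y, x)
    return torch.relu(torch.log(p_before_xy) - torch.log(p_after_yx))

def transitivity_loss(p_xy: torch.Tensor, p_yz: torch.Tensor, p_xz: torch.Tensor) -> torch.Tensor:
    # Rule (assumed): BEFORE(x, y) AND BEFORE(y, z) -> BEFORE(x, z)
    return torch.relu(torch.log(p_xy) + torch.log(p_yz) - torch.log(p_xz))

def total_loss(task_loss: torch.Tensor, constraint_losses, weights) -> torch.Tensor:
    # Weighted sum of the task loss (e.g. cross-entropy over relation labels)
    # and the constraint losses, as the excerpt describes. Weights are placeholders.
    return task_loss + sum(w * c for w, c in zip(weights, constraint_losses))

# Toy usage with made-up probabilities from a relation classifier's softmax:
p = torch.tensor
loss = total_loss(
    task_loss=p(0.42),
    constraint_losses=[symmetry_loss(p(0.9), p(0.6)),
                       transitivity_loss(p(0.8), p(0.7), p(0.5))],
    weights=[0.5, 0.5],
)
```

Minimizing such terms pushes the classifier's output distributions toward label configurations where the rules hold, without requiring jointly labeled data for every relation type.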
“…The intuition is that higher-level, more abstract events are relatively more salient. We use the model from Wang et al. (2020) to identify the parent-child relationship between every event pair. An event is called a child of another event if it is a subevent of the parent (e.g., “shooting” may be a subevent/child of an “attack” event).…”
Section: Event Augmentation With Global Features (citation type: mentioning)
confidence: 99%
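A small sketch of how pairwise parent-child predictions might be assembled into a subevent hierarchy for this salience heuristic. Here `predict_parent_child` is a hypothetical placeholder standing in for the Wang et al. (2020) classifier, and the label names are assumptions.

```python
from itertools import combinations

def build_subevent_graph(events, predict_parent_child):
    """predict_parent_child(e1, e2) -> 'PARENT-CHILD', 'CHILD-PARENT', or 'NOREL'
    (hypothetical interface for a pairwise subevent classifier)."""
    children = {e: [] for e in events}
    for e1, e2 in combinations(events, 2):
        label = predict_parent_child(e1, e2)
        if label == "PARENT-CHILD":     # e1 is the parent of e2
            children[e1].append(e2)
        elif label == "CHILD-PARENT":   # e2 is the parent of e1
            children[e2].append(e1)
    return children

def salient_events(children):
    # Events that are nobody's child sit at the top of the hierarchy;
    # per the excerpt's intuition, these more abstract events are more salient.
    all_children = {c for kids in children.values() for c in kids}
    return [e for e in children if e not in all_children]

# Toy run: the predictor says "attack" is the parent of "shooting".
graph = build_subevent_graph(
    ["attack", "shooting"],
    lambda a, b: "PARENT-CHILD" if (a, b) == ("attack", "shooting") else "NOREL",
)
assert salient_events(graph) == ["attack"]
```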