Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1280
Fine-Grained Temporal Relation Extraction

Abstract: We present a novel semantic framework for modeling temporal relations and event durations that maps pairs of events to real-valued scales. We use this framework to construct the largest temporal relations dataset to date, covering the entirety of the Universal Dependencies English Web Treebank. We use this dataset to train models for jointly predicting fine-grained temporal relations and event durations. We report strong results on our data and show the efficacy of a transfer-learning approach for predicting c…
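The abstract describes mapping event pairs to real-valued scales rather than discrete relation labels. As a rough illustration of the general idea (not the paper's actual model), the sketch below places two events' start and end points on a shared [0, 1] scale and reads off a coarse Allen-style relation; the function name, thresholds, and relation labels are all hypothetical.

```python
def coarse_relation(s1, e1, s2, e2):
    """Derive a coarse temporal relation from real-valued start/end
    points of two events on a shared [0, 1] scale.

    Illustrative only: the labels and decision rules here are
    assumptions, not the scheme used by Vashishtha et al. (2019).
    """
    assert s1 <= e1 and s2 <= e2, "each event must start before it ends"
    if e1 < s2:            # event 1 finishes before event 2 starts
        return "BEFORE"
    if e2 < s1:            # event 2 finishes before event 1 starts
        return "AFTER"
    if s1 <= s2 and e2 <= e1:   # event 2 lies inside event 1
        return "CONTAINS"
    if s2 <= s1 and e1 <= e2:   # event 1 lies inside event 2
        return "IS_CONTAINED"
    return "OVERLAP"       # partial overlap in either direction

print(coarse_relation(0.0, 0.3, 0.5, 0.9))  # BEFORE
```

A real-valued representation like this is what lets the framework express fine-grained distinctions (e.g. how much two events overlap) that a fixed label set cannot.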

Cited by 50 publications (76 citation statements)
References 31 publications
“…'GCL' (Meng and Rumshisky, 2018) is the global context layer model introduced in § 2.1, and 'Fine-grained TRC' is the model of Vashishtha et al. (2019), which suggests the effectiveness of the two main proposals. The 'SEC' model further outperforms 'Multi-BERT' by a 3.6-point gain on the majority category E2E, a 1.0-point gain on E2T, and a 0.7-point gain on E2D, which indicates the impact of the global SEC RNN.…”
Section: Main TimeBank-Dense Results
confidence: 98%
“…Most existing temporal relation classification approaches focus on extracting various features from the textual sentence in the local pair-wise setting. Inspired by the success of neural networks in various NLP tasks, Cheng and Miyao (2017), Meng et al. (2017), Vashishtha et al. (2019), and Han et al. (2019a,b) propose a series of neural networks that achieve accuracy with less feature engineering. However, these neural models still remain in the pairwise setting.…”
Section: Temporal Relation Classification
confidence: 99%
“…Recently, there have been interesting developments in annotating news texts with relative temporal information [9], [33], which are out of the scope of this work as we focus on extracting absolute timelines, which can be interpreted directly on the calendar.…”
Section: A. Event Position
confidence: 99%
“…One problem with annotating preconditions in text is the large number of event mentions in each article, which means annotation of all possible event pairs is infeasible. The temporal community has struggled with this same dilemma (Chambers et al., 2014; Vashishtha et al., 2019).…”
Section: Preconditions Dataset
confidence: 99%