Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.432

Annotating Temporal Dependency Graphs via Crowdsourcing

Abstract: We present the construction of a corpus of 500 Wikinews articles annotated with temporal dependency graphs (TDGs) that can be used to train systems to understand temporal relations in text. We argue that temporal dependency graphs, built on previous research on narrative times and temporal anaphora, provide a representation scheme that achieves a good balance between completeness and practicality in temporal annotation. We also provide a crowdsourcing strategy to annotate TDGs, and demonstrate the feasibility …
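Neither the abstract nor the citation statements below spell out the TDG data format, so the following is a minimal illustrative sketch in Python. It assumes nodes for events, time expressions, and a document-creation-time meta node, plus edges that attach each node to one or more reference nodes with a temporal relation label (the relaxation of the single-reference-edge tree constraint noted by the citing work). The class names, relation inventory, and toy sentence are placeholders, not the paper's actual annotation scheme.

from dataclasses import dataclass, field

# Placeholder relation inventory; the corpus's actual label set may differ.
RELATIONS = {"before", "after", "overlap", "included-in"}

@dataclass
class Node:
    node_id: str   # e.g. "e1" for an event, "t1" for a time expression
    text: str      # surface form in the article
    kind: str      # "event", "timex", or "meta" (e.g. document creation time)

@dataclass
class Edge:
    child: str     # id of the event/timex being temporally located
    parent: str    # id of its reference node
    relation: str  # temporal relation of child with respect to parent

@dataclass
class TemporalDependencyGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_node(self, node: Node) -> None:
        self.nodes[node.node_id] = node

    def add_edge(self, child: str, parent: str, relation: str) -> None:
        # Unlike a temporal dependency *tree*, a graph may give a node
        # more than one reference edge, so edges are simply appended.
        assert relation in RELATIONS
        self.edges.append(Edge(child, parent, relation))

# Toy annotation for: "The storm hit on Monday. Cleanup began afterwards."
tdg = TemporalDependencyGraph()
tdg.add_node(Node("DCT", "document creation time", "meta"))
tdg.add_node(Node("t1", "Monday", "timex"))
tdg.add_node(Node("e1", "hit", "event"))
tdg.add_node(Node("e2", "began", "event"))
tdg.add_edge("t1", "DCT", "before")        # Monday precedes publication
tdg.add_edge("e1", "t1", "included-in")    # the storm hit within Monday
tdg.add_edge("e2", "e1", "after")          # cleanup began after the storm hit
print(f"{len(tdg.nodes)} nodes, {len(tdg.edges)} edges")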

Cited by 5 publications (14 citation statements)
References 17 publications

“…Zhang and Xue (2018b) provided the earliest TDT corpus on news data and narrative stories, and Zhang and Xue (2019) released the first English TDT corpus. Yao et al. (2020a) relaxed the single-reference-edge assumption of dependency trees to form the improved TDG. Zhang and Xue (2018a) built an end-to-end neural temporal dependency parser using a BiLSTM, and Ross et al. (2020b) improved it further by incorporating BERT.…”
Section: Related Work (mentioning)
confidence: 99%
“…events that are certain to happen vs. those that might happen), event ambiguity (e.g., agreeing to the terms of a contract vs. signing a contract), and the need for complete annotation of all event pairs for precise temporal localization (Yao et al., 2020a).…”
Section: Introduction (mentioning)
confidence: 99%
“…It also adds a representation for aspect and quantifier scope, which are not part of AMR. At the document level, UMR represents temporal (Zhang and Xue, 2018b,a; Yao et al., 2020) and modal dependencies (Vigus et al., 2019) as well as coreference. UMR abstracts away from syntactic representations and preserves semantic relations within and across sentences.…”
Section: Introduction / UMR Overview (mentioning)
confidence: 99%