Proceedings of the 13th Linguistic Annotation Workshop 2019
DOI: 10.18653/v1/w19-4003

Crowdsourcing Discourse Relation Annotations by a Two-Step Connective Insertion Task

Abstract: The prospect of being able to crowdsource coherence relations promises quick acquisition of annotations for new texts, which could in turn increase the size and variety of discourse-annotated corpora. It would also open the avenue to answering new research questions: collecting annotations from a larger number of individuals per instance would make it possible to investigate the distribution of inferred relations and to study individual differences in coherence relation interpretation. However, annotating coherence…
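The distributional research question raised in the abstract amounts to aggregating many crowd labels per instance. Below is a minimal Python sketch of that aggregation step; the data layout and relation labels are illustrative assumptions, not the paper's actual materials.

from collections import Counter

def relation_distribution(labels):
    # Turn the crowd labels collected for one instance into a
    # normalized distribution over coherence relations.
    counts = Counter(labels)
    total = sum(counts.values())
    return {relation: n / total for relation, n in counts.items()}

# Hypothetical labels from ten workers for one implicit relation:
crowd_labels = ["cause", "cause", "concession", "cause", "conjunction",
                "cause", "cause", "concession", "cause", "cause"]
print(relation_distribution(crowd_labels))
# -> {'cause': 0.7, 'concession': 0.2, 'conjunction': 0.1}

With such distributions in hand, individual differences can be studied by comparing a worker's labels against the per-instance distribution rather than against a single gold label.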

Cited by 17 publications (18 citation statements)
References 9 publications
“…This result was not observed in previous studies, where the context in texts either facilitated the performance with connectives or did not influence it at all (Scholman & Demberg, 2017; Yung, Demberg, & Scholman, 2019). This discrepancy can in the first instance be explained by our text‐level cloze task being rather different from those used in the studies mentioned.…”
Section: Discussion (citation type: contrasting)
confidence: 93%
“…Such distinctions should preferably be picked out at the very beginning to be incorporated fully into the annotation schema. Tasks related to grouping and connectivity annotation could be crowd-sourced relatively easily, whereas annotating diagram types and discourse relations may require multi-step procedures and assistance in the form of prompts, as Yung et al (2019) have recently shown for RST. Involving both expert and crowd-sourced annotators could also alleviate problems related to circularity by forcing domain experts to frame the tasks in terms understandable to crowd-sourced workers (Riezler, 2014).…”
Section: Discussion (citation type: mentioning)
confidence: 99%
“…This kind of non-theoretical grounding (Riezler 2014) could help to break circularity by evaluating, for instance, whether naive annotators perceive diagram elements to form visual groups (grouping) or whether arrows and lines are considered to signal connections between individual diagram elements or visual groups (connectivity). For discourse structure annotation, Yung et al (2019) introduce a multi-step procedure for sourcing descriptions of discourse relations…”
Section: On the Reliability and Reproducibility of the AI2D-RST Annotation (citation type: mentioning)
confidence: 99%