2017
DOI: 10.1007/978-3-319-66435-4_5
Crowdsourcing for Information Visualization: Promises and Pitfalls

Cited by 23 publications (23 citation statements)
References 87 publications
“…We used Amazon's Mechanical Turk (MTurk) to crowdsource our study to a broad population. We followed best-practice recommendations for crowdsourcing-based visualization studies [62]. In the MTurk posting, we showed participants a visual sample of the tasks they would be performing with the NL or AM visualizations, informed them that they would perform the study with either the NL or AM visualization, and directed them to further information available on the study webpage.…”
Section: Methods
confidence: 99%
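Neither the excerpt nor the cited chapter spells out how such an MTurk posting is created programmatically. As a point of reference only, the sketch below shows how a posting of this kind might be set up with boto3's MTurk client; the study URL, reward, participant count, and HIT text are hypothetical placeholders, not details from the cited study.

```python
# Minimal sketch of creating an MTurk posting (HIT) that links out to an
# external study webpage, as described in the excerpt above. All concrete
# values (URL, reward, counts, wording) are illustrative assumptions.
import boto3

# The sandbox endpoint lets requesters test a HIT without paying workers.
MTURK_SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"

mturk = boto3.client("mturk", region_name="us-east-1",
                     endpoint_url=MTURK_SANDBOX)

# An ExternalQuestion points workers at the study's own webpage, which can
# show a visual sample of the tasks before participants accept the HIT.
external_question = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/vis-study</ExternalURL>
  <FrameHeight>800</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Visualization study: answer questions about a chart",
    Description="You will perform simple reading tasks with one of two "
                "chart designs; a task preview is on the study page.",
    Keywords="visualization, study, chart",
    Reward="2.00",                      # USD per assignment (assumed)
    MaxAssignments=50,                  # target number of participants (assumed)
    LifetimeInSeconds=7 * 24 * 3600,    # how long the posting stays visible
    AssignmentDurationInSeconds=45 * 60,
    Question=external_question,
)
print("HIT ID:", hit["HIT"]["HITId"])
```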
“…Our experimental comparison of visualization design alternatives continues a tradition established by Cleveland and McGill [22] and extended by the crowdsourced graphical perception work of Heer and Bostock [31]. Our experiment also makes use of a crowdsourcing platform, which circumvents some of the limitations of directly observed lab studies while providing a large and diverse participant pool [12,13]. Our work follows two recent crowdsourced experiments relating to visualization on mobile phones: our previous study comparing alternative ways to visualize ranges on mobile phones [17], and Schwab et al.'s study of panning and zooming techniques with temporal data [54].…”
Section: (Mobile) Visualization Evaluation Studies
confidence: 99%
“…Following the crowdsourced graphical perception work of Heer and Bostock [32], our experiment involved the use of a crowdsourcing platform, which helps to overcome the limitations of controlled lab studies by providing a large and diverse participant pool [12]. To our knowledge, our work is the first to conduct a visualization evaluation study on participants' own mobile phones by leveraging a crowd platform.…”
Section: Visualization Evaluation Studies
confidence: 99%