Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/2858036.2858280

Evaluating Information Visualization via the Interplay of Heuristic Evaluation and Question-Based Scoring

Figure 1. Three visualizations of the same dataset (from the game show Jeopardy!), designed to be of (a) low, (b) high, and (c) middling usability.

Abstract: In an instructional setting it can be difficult to accurately assess the quality of information visualizations of several variables. Instead of a standard design critique, an alternative is to ask potential readers of the chart to answer questions about it. A controlled study with 47 participants shows a good correlation between aggregated novice heuristic …
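The abstract describes comparing aggregated novice heuristic-evaluation scores against question-based scores for the same charts. As a minimal sketch (not from the paper; all names and values are hypothetical), the snippet below computes a Pearson correlation between two such per-visualization score lists:

```python
# Hypothetical illustration only: correlating aggregated heuristic scores
# with question-based scores across a set of chart designs.
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented per-visualization scores (one entry per chart design).
heuristic_scores = [2.1, 4.5, 3.2]     # aggregated novice heuristic ratings
question_scores  = [0.40, 0.90, 0.65]  # fraction of reader questions answered correctly

print(f"r = {pearson(heuristic_scores, question_scores):.2f}")
```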

Cited by 28 publications (11 citation statements)
References 28 publications
“…• used crowdsourcing only for the pilot study (e.g., [TGH12]); • used crowdsourcing not for evaluation but for other purposes such as data collection (e.g., user generated layouts [KDMW16, KWD14, vHR08]); • evaluated graphics but not information visualization (e.g., [ • discussed previous crowdsourcing studies without using crowdsourcing (e.g., [AJB16, ZGB*17, KH16]); • mentioned the use of crowdsourcing for future evaluation (e.g., [HLS16]).…”
Section: Methods (mentioning)
confidence: 99%
“…A query and visualisation evaluation was conducted to assess the efficiency of the visualisation system in delivering visualised content to users (Amri, Ltifi & Ayed, 2015; Hearst, Laskowski & Silva, 2016). This helps measure the performance of sending queries from the GUI to the server and translating the results into a visual format.…”
Section: Methods (mentioning)
confidence: 99%
“…There are different approaches to bringing usability or playability into development, from player-centered design models such as in Charles and colleagues [9] to heuristic evaluation such as in Desurvire and Wiberg [13]. One of the simplest ways to enhance usability is to introduce heuristics that can be used by every team member, without the need for usability professionals [28], [38], [50], [66]: they provide a cheap and flexible tool for finding issues early and for better understanding usability and/or playability [53], and they can be used by novice evaluators with good results [22]. Usability heuristics for games were first introduced by Malone [39] in 1980.…”
Section: Usability and Games (mentioning)
confidence: 99%