2013
DOI: 10.1109/tvcg.2013.126
A Systematic Review on the Practice of Evaluating Visualization

Cited by 284 publications (233 citation statements)
References 69 publications (103 reference statements)
“…As with cartography, there is increasing emphasis on user-centered evaluation in the related fields of information visualization and scientific visualization as the essential mechanism for determining whether an interactive design 'works' (e.g. Barkhuus & Rode, 2007;Borkin et al, 2011;Isenberg, Isenberg, Chen, Sedlmair, & Möller, 2013;Lam, Bertini, Isenberg, Plaisant, & Carpendale, 2012). But, which empirical methods should we use, and at what times during design?…”
Section: Introduction: Whither User Studies In Cartography? (mentioning)
Confidence: 99%
“…Tasks could include: specific fact-finding (e.g., finding someone's phone number), extended fact-finding (e.g., finding books by the same author), open-ended browsing (e.g., identifying any new work on voice recognition from Japan), and exploration (e.g., finding out about your family history from an online archive) (Shneiderman, 1986, p. 512). For more browsing-oriented tasks, such as exploration, evaluation may go beyond simply dealing with ranked lists of results and assessing visualisations (Isenberg et al, 2013).…”
Section: Tasks and Topics (mentioning)
Confidence: 99%
“…This is not applicable in our case, since we use the same rendering for both volumes. Instead, we employ evaluation measures from information visualisation [29] and use the two most prevalent evaluation methodologies in the visualisation community: Qualitative Result Inspection (QRI) and Algorithm Performance (AP). QRI is a method where a certain visualisation methodology is used to produce an image that demonstrates how something that could not be seen before can now be seen with the new method.…”
Section: Comparison Of 3d Densities: Space-time Density and Stacked D… (mentioning)
Confidence: 99%
“…QRI is a method where a certain visualisation methodology is used to produce an image that demonstrates how something that could not be seen before can now be seen with the new method. An extensive review of information visualisation studies [29] found that QRI was used in 95 % of all cases. An example of this methodology used for 2D trajectory density can be found in [49].…”
Section: Comparison Of 3d Densities: Space-time Density and Stacked D… (mentioning)
Confidence: 99%