Proceedings of the 2012 BELIV Workshop: Beyond Time and Errors - Novel Evaluation Methods for Visualization 2012
DOI: 10.1145/2442576.2442585
The importance of tracing data through the visualization pipeline

Abstract: Visualization research focuses either on the transformation steps necessary to create a visualization from data, or on the perception of structures after they have been shown on the screen. We argue that an end-to-end approach is necessary that tracks the data all the way through the required steps, and provides ways of measuring the impact of any of the transformations. By feeding that information back into the pipeline, visualization systems will be able to adapt the display to the data to be shown, the para…

Cited by 7 publications (1 citation statement)
References 27 publications
“…
• for individual data chunks, quality metrics can be used to assess the quality of an element traveling the pipeline [41] (e.g., number of processing steps this data has already passed through);
• for two subsequent data chunks, delta metrics can be computed at data level and at view level to measure the difference between them;
• for the entirety of all data chunks having been processed so far, error metrics can be used to estimate the error of the current partial result as compared to the exact result (e.g., convergence of an iterative algorithm, error introduced by numerical approximations).

These metrics may and should be used in conjunction.…”
Section: Subdivided Operators
confidence: 99%
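The three metric families in the quoted passage — per-chunk quality, chunk-to-chunk delta, and cumulative error against an exact result — can be sketched as follows. The `Chunk` structure and the particular metric formulas (step count, L1 difference, maximum deviation) are illustrative assumptions for this sketch, not the implementation used in the cited work.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """A data chunk traveling the visualization pipeline (hypothetical structure)."""
    values: list          # the chunk's data values
    steps_passed: int = 0 # how many pipeline operators have processed this chunk

def quality_metric(chunk: Chunk) -> int:
    # Quality metric for an individual chunk: here, simply the number of
    # processing steps the chunk has already passed through.
    return chunk.steps_passed

def delta_metric(prev: Chunk, curr: Chunk) -> float:
    # Delta metric between two subsequent chunks, computed at data level
    # as the summed absolute element-wise difference.
    return sum(abs(a - b) for a, b in zip(prev.values, curr.values))

def error_metric(partial: list, exact: list) -> float:
    # Error metric for the partial result accumulated so far: the maximum
    # deviation from the exact result (e.g., to monitor convergence).
    return max(abs(a - b) for a, b in zip(partial, exact))
```

Used in conjunction, as the passage suggests, these metrics let a progressive pipeline weigh how processed a chunk is, how much the view just changed, and how far the partial result still is from the exact one.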