2011 IEEE Seventh International Conference on eScience 2011
DOI: 10.1109/escience.2011.23
A Novel Framework for Monitoring and Analyzing Quality of Data in Simulation Workflows

Abstract: In recent years scientific workflows have been used for conducting data-intensive and long-running simulations. Such simulation workflows process and produce different types of data whose quality has a strong influence on the final outcome of the simulations. Therefore, being able to monitor and analyze the quality of this data during workflow execution is of paramount importance, as the detection of quality problems enables efficient control of simulation execution. Unfortunately, existing…

Cited by 10 publications (5 citation statements)
References 16 publications
“…This facet entails properties such as feature-richness (i.e., how many dimensions are in the data) [101] and truthfulness (i.e., the extent to which the data corresponds with reality) [118]. Finally, data quality has a context-based facet [119], which includes the ease with which it can be integrated with other data [34], the relevance of the data for the data consumer, as well as (proof of) data provenance (i.e., the origin of the data) [10], [13].…”
Section: Data Quality Assessment (mentioning)
confidence: 99%
“…However, an important issue is how to scale problem solving when the complex software detects critical situations. We approach this question by using human-based workflows [13] and a high-level elasticity control language [11] to invoke human-based services when needed, e.g., when the quality of data is low. The following list shows an example of how to invoke human-based services within an analytics service: In this case, both software and humans are involved, and human-based services (indicated by ServiceUnitType.HBS) are scaled out to examine the situations.…”
Section: Fig 5 Example Of Data Analytics Elasticity For Smart Cities (mentioning)
confidence: 99%
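The elasticity pattern described in this citation statement, escalating to human-based services when data quality drops, can be illustrated with a minimal sketch. All names here (`AnalyticsService`, `HumanBasedService`, the threshold value) are hypothetical stand-ins, not the authors' actual API; only `ServiceUnitType.HBS` is mentioned in the quoted text.

```python
# Hypothetical sketch: route low-quality results to human review and
# scale out the human-based service pool, in the spirit of the
# elasticity control described above. Names are illustrative.
from dataclasses import dataclass, field
from typing import List


@dataclass
class HumanBasedService:
    """Stands in for a pool of human workers (cf. ServiceUnitType.HBS)."""
    instances: int = 1

    def scale_out(self) -> None:
        # Add one more human-based service instance to the pool.
        self.instances += 1


@dataclass
class AnalyticsService:
    qod_threshold: float = 0.8          # assumed quality-of-data cutoff
    hbs: HumanBasedService = field(default_factory=HumanBasedService)
    escalated: List[float] = field(default_factory=list)

    def process(self, qod_score: float) -> str:
        # Low data quality triggers human examination and HBS scale-out;
        # otherwise the software path handles the result on its own.
        if qod_score < self.qod_threshold:
            self.hbs.scale_out()
            self.escalated.append(qod_score)
            return "human-review"
        return "automatic"


service = AnalyticsService()
results = [service.process(q) for q in (0.95, 0.40, 0.85, 0.55)]
```

Here two of the four items fall below the assumed threshold, so the human-based pool grows from one to three instances.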
“…In order to measure, monitor, and evaluate QoD metrics for FEM-based simulations, we utilize our extensible QoD Evaluation Framework for workflows developed in [8]. Generally, with this framework, we can determine QoD in a very flexible way: (i) platform and language independent metrics and interpretations can be invoked, (ii) separate metrics as well as metrics that are included in comprehensive algorithms (e.g.…”
Section: QoD Evaluation Framework (mentioning)
confidence: 99%
“…We consider QoD as a tuple of characteristics and goodness [8]. A characteristic of data will be analyzed without any simulation context, while the goodness of the characteristic of data will be evaluated with respect to the specific simulation context.…”
Section: QoD Metrics For FEM-based Simulations (mentioning)
confidence: 99%
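The characteristic/goodness split described here, a context-free measurement paired with a context-dependent rating, can be sketched as a simple pair type. The field names and the linear goodness function are assumptions for illustration, not the paper's actual definitions.

```python
# Hedged sketch of QoD as a (characteristic, goodness) pair: the
# characteristic is measured without simulation context, while the
# goodness rates it against a simulation-specific requirement.
from typing import NamedTuple


class QoD(NamedTuple):
    characteristic: float   # context-free measurement, e.g. a mesh node count
    goodness: float         # context-dependent rating in [0, 1]


def assess(value: float, required: float) -> QoD:
    # Assumed goodness function: how well the raw characteristic meets
    # this simulation's requirement, capped at 1.0 once it is satisfied.
    return QoD(characteristic=value, goodness=min(value / required, 1.0))


q = assess(value=500.0, required=1000.0)
```

With a requirement of 1000 and a measured value of 500, the characteristic is reported unchanged while the goodness comes out at 0.5.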