2020
DOI: 10.1162/dint_a_00033

FAIR Computational Workflows

Abstract: Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products. They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance. These properties aid data quality assessment and contribute to secondary data usage. Moreover,…
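The abstract's central mechanism, a workflow emitting FAIR-supporting metadata and provenance as a side effect of execution, can be made concrete with a small sketch. The following Python example is illustrative only and not from the paper; the step functions, record fields, and use of content hashes as identifiers are all assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

def checksum(data: bytes) -> str:
    # Content hash, usable as a stable identifier for intermediate data.
    return hashlib.sha256(data).hexdigest()

def run_step(name, func, data, provenance):
    # Execute one workflow step and append a provenance record for it.
    result = func(data)
    provenance.append({
        "step": name,
        "input_sha256": checksum(data),
        "output_sha256": checksum(result),
        "executed_at": datetime.now(timezone.utc).isoformat(),
    })
    return result

# Hypothetical steps standing in for data preparation and analytics.
def clean(data: bytes) -> bytes:
    return b" ".join(data.split())

def analyse(data: bytes) -> bytes:
    return str(sum(int(token) for token in data.split())).encode()

provenance = []
raw = b"  42 7 13  "
cleaned = run_step("clean", clean, raw, provenance)
result = run_step("analyse", analyse, cleaned, provenance)

# The provenance log is itself metadata that can travel with the data product.
print(result.decode())                  # "62"
print(json.dumps(provenance, indent=2))
```

Each record links an input hash to an output hash, so the chain of steps that produced a data product can be reconstructed and audited, which is the provenance-tracking property the abstract describes.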


Cited by 151 publications (137 citation statements)
References 47 publications
“…Software is often built using other software. This is especially obvious for software that implements multi-step processes to coordinate multiple tasks and their data dependencies, which are usually referred to as workflows [5,6]. Generally, all software applications that are not written completely from scratch are of a composite nature that easily leads to complex dependencies.…”
Section: Software Is Not Data
confidence: 99%
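The multi-step, dependency-coordinating processes this statement refers to are naturally modelled as a directed acyclic graph of tasks. A minimal sketch, with task names and edges assumed for illustration rather than taken from either paper:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks whose outputs it depends on.
dependencies = {
    "collect": set(),
    "prepare": {"collect"},
    "model":   {"prepare"},
    "report":  {"model", "prepare"},
}

# static_order() yields tasks so every task runs after its inputs exist;
# a real workflow engine would dispatch a command or function per task.
for task in TopologicalSorter(dependencies).static_order():
    print("running", task)
```

The composite nature the statement mentions shows up directly in the graph: "report" cannot run until both of its upstream dependencies have produced their data.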
“…Thus, interoperability for software can be considered both for individual objects, which are the final product of a digital stack, and as part of broader digital ecosystems, which includes complex processes and workflows as well as their interaction [6,54,55]. Different pieces of software can also work together independent of programming languages, operating systems and specific hardware requirements through the use of APIs and/or other communication protocols.…”
Section: Interoperability
confidence: 99%
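A small sketch of the language-independent interoperability the statement describes, using plain HTTP with a JSON payload; the service URL and payload fields are hypothetical, and any HTTP-capable program in any language could serve or call the same interface:

```python
import json
from urllib import request

# Hypothetical request to a peer component; the protocol, not a shared
# language or runtime, is what makes the two programs interoperable.
payload = json.dumps({"dataset": "example-001", "action": "normalise"}).encode()
req = request.Request(
    "http://localhost:8000/tasks",   # assumed service URL, for illustration
    data=payload,
    headers={"Content-Type": "application/json"},
)
with request.urlopen(req) as resp:
    print(json.loads(resp.read()))   # structured reply from the other program
```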
“…Workflows and input/output data can be publicly referenced [321,322] on the Workflow4metabolomics platform, thus enabling fully reproducible research. By using workflow systems, the reuse and reprocessing of data sets is greatly encouraged, as well as the tracking of data provenance [323]. This way, workflows help to boost the FAIR principles that were shaped for data [324].…”
Section: R-Packages for Metabolomics
confidence: 99%
“…[55] The need for data sharing and repositories has also become more apparent via technical innovations such as machine and deep learning, as these require large pooled data sets to generate data-driven clinical decision models [56][57][58]. This need has particularly accelerated the integration of the FAIR data principles in radiation oncology, where structured machine readability (ie, index-ability/search-ability) and annotated data curation, as opposed to unrefined data, are imperative. These principles are structurally reflected and supported by recent investment in shared data infrastructure, such as the NIH Strategic Plan for Data Science and the Data Commons Framework.…”
Section: Data Sharing: FAIRness in Data Accessibility
confidence: 99%