2020
DOI: 10.1162/dint_a_00028
A Generic Workflow for the Data FAIRification Process

Abstract: The FAIR guiding principles aim to enhance the Findability, Accessibility, Interoperability and Reusability of digital resources such as data, for both humans and machines. The process of making data FAIR (“FAIRification”) can be described in multiple steps. In this paper, we describe a generic step-by-step FAIRification workflow to be performed in a multidisciplinary team guided by FAIR data stewards. The FAIRification workflow should be applicable to any type of data and has been developed and used for “Brin…

Cited by 83 publications (82 citation statements)
References 18 publications
“…Data FAIRification is a complex process, requiring several areas of expertise—domain knowledge on the specifics of the data generation process: physicochemical characterization, eco and human toxicology, exposure, material science, as well as knowledge from the computer science domain—data modelling, software architecture, software services and their secure deployment, data protection and licenses. The need for multidisciplinary team involvement is obvious and, together with strong data stewardship skills, is recommended by several guides and publications [ 6 , 32 ]. However, this is a heavyweight process and data providers are frequently reluctant and slow to adopt workflows, tools, and data formats outside of their daily routine.…”
Section: Discussion
confidence: 99%
“…MeSH annotations were collected from (Caniza, Romero & Paccanaro, 2015) (https://paccanarolab.org/disease_similarity), and annotations were also obtained via the NCBO Annotator API (Noy et al., 2009) using the OMIM disease description. The data that were not yet in a Linked Data format were converted to RDF with a FAIRification process (Jacobsen et al., 2019). We kept copies of the retrieved non-RDF datasets in our GitHub repository to prevent the data access issues that may arise if data sources become unavailable.…”
Section: FAIRified Data Collection
confidence: 99%
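The conversion described above, turning non-RDF annotation records into RDF, can be sketched in plain Python by emitting N-Triples lines. This is a minimal illustration only: the predicate IRI and the OMIM/MeSH identifiers below are hypothetical placeholders, not the vocabulary or data used in the cited pipeline.

```python
# Minimal sketch: serialize disease-annotation records as RDF N-Triples.
# The predicate IRI below is an illustrative placeholder, not a term from
# the cited FAIRification pipeline.

MESH = "http://id.nlm.nih.gov/mesh/"
OMIM = "https://omim.org/entry/"
ANNOTATED_WITH = "http://example.org/vocab#annotatedWith"  # hypothetical predicate

def to_ntriples(records):
    """Convert (omim_id, mesh_id) pairs into N-Triples lines."""
    lines = []
    for omim_id, mesh_id in records:
        subj = f"<{OMIM}{omim_id}>"
        pred = f"<{ANNOTATED_WITH}>"
        obj = f"<{MESH}{mesh_id}>"
        lines.append(f"{subj} {pred} {obj} .")
    return "\n".join(lines)

if __name__ == "__main__":
    # Example record: one OMIM entry annotated with one MeSH descriptor.
    print(to_ntriples([("104300", "D000544")]))
```

In practice a library such as rdflib would be used instead of string formatting, since it handles IRI escaping, literals, and multiple serialization formats; the hand-rolled version above only shows the shape of the mapping from tabular records to triples.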
“…Second, the expertise required to achieve the objectives was identified. Conducting the FAIRification process requires a highly multidisciplinary team guided by FAIR data steward(s) [8].…”
Section: Step 1 - Identify FAIRification Objectives and Expertise
confidence: 99%
“…Also, research expertise is applicable throughout the table and is, for simplicity, not specified. The areas of expertise were learned from a previous project [8] and further advanced here. Table 1.…”
Section: Step 1 - Identify FAIRification Objectives and Expertise
confidence: 99%