2019
DOI: 10.3390/ijerph16050769

Merging Data Diversity of Clinical Medical Records to Improve Effectiveness

Abstract: Medicine is a knowledge area continuously experiencing changes. Every day, discoveries and procedures are tested with the goal of providing improved service and quality of life to patients. With the evolution of computer science, multiple areas experienced an increase in productivity with the implementation of new technical solutions. Medicine is no exception. Providing healthcare services in the future will involve the storage and manipulation of large volumes of data (big data) from medical records, requirin…

Cited by 16 publications (5 citation statements)
References 39 publications
“…Thus, efforts to optimize preclinical PTSD models for reliability/reproducibility are essential for promoting mechanistic understanding of the disease and enhancing their ability to serve as effective platforms for evaluating new and promising therapeutics (Zhang et al, 2019). Efforts to achieve methodological consensus within a preclinical PTSD model can also facilitate meta-analyses and the creation of metadata, which serve as powerful strategies in translational research (Helgheim et al, 2019).…”
Section: Introduction
confidence: 99%
“…The design of an ETL process faces several challenges, including the following: (1) the ETL process should be able to process huge amounts of data at once [62]; (2) the ETL process should be repeatable: if the source data change, the ETL process needs to be rerun to reprocess the source data (Observational Health Data Sciences and Informatics [OHDSI]) [63]; (3) expert-level anonymization methodologies might be integrated into ETL workflows whenever possible [61]; and (4) there is a need to check for loss of data and compromised data integrity. The latter was highlighted in a recent study in which inaccurate cohort identification took place because erroneous vocabulary mappings of a common data model were used (e.g., ETL programming bugs and errors not captured during the quality assurance stages) [64].…”
Section: Insight
confidence: 99%
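
The statement above names two properties that can be illustrated concretely: repeatability (point 2) and integrity checking (point 4). Below is a minimal sketch, not the pipeline from the cited study, showing one common way to get both: an idempotent upsert keyed on a stable record identifier, followed by a row-count check before the load is accepted. All table names, column names, and sample values (source_records, target_records, diagnosis_code, etc.) are hypothetical placeholders.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> None:
    # Extract: read-only against the source, so this step can be rerun safely.
    rows = conn.execute(
        "SELECT record_id, patient_id, diagnosis_code FROM source_records"
    ).fetchall()

    # Transform: normalize codes (a stand-in for real vocabulary mapping,
    # where the cited study notes errors can silently corrupt cohorts).
    mapped = [(rid, pid, code.strip().upper()) for rid, pid, code in rows]

    # Load: upsert keyed on record_id makes the step idempotent --
    # rerunning after a source change refreshes rows instead of duplicating them.
    conn.executemany(
        "INSERT INTO target_records (record_id, patient_id, diagnosis_code) "
        "VALUES (?, ?, ?) "
        "ON CONFLICT(record_id) DO UPDATE SET "
        "patient_id = excluded.patient_id, "
        "diagnosis_code = excluded.diagnosis_code",
        mapped,
    )
    conn.commit()

    # Integrity check: detect silent data loss before the load is accepted.
    src = conn.execute("SELECT COUNT(*) FROM source_records").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM target_records").fetchone()[0]
    if src != tgt:
        raise RuntimeError(f"integrity check failed: {src} source rows, {tgt} loaded")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE source_records (record_id INTEGER PRIMARY KEY,
                                     patient_id TEXT, diagnosis_code TEXT);
        CREATE TABLE target_records (record_id INTEGER PRIMARY KEY,
                                     patient_id TEXT, diagnosis_code TEXT);
        INSERT INTO source_records VALUES (1, 'p01', ' e11.9'), (2, 'p02', 'i10');
    """)
    run_etl(conn)
    run_etl(conn)  # rerun: idempotent, still passes the integrity check
    print(conn.execute("SELECT * FROM target_records").fetchall())
```

Running the load twice produces the same target table, which is the practical meaning of "repeatable" in the OHDSI sense quoted above; a production pipeline would add checksums and per-field validation rather than a bare row count.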
“…Healthcare data integration is a vital research topic, involving patient privacy issues and heterogeneous information systems. There is already some work on this topic [1] and [2]; nevertheless, isolated data from a single source is insufficient. Data must be enriched by adding further information (metadata) and integrated with other data sources.…”
Section: Introduction
confidence: 99%
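
The enrichment-then-integration idea in this statement can be sketched briefly. The following is a minimal illustration under assumed names (enrich, integrate, hospital_ehr, lab_system, and all field names are hypothetical, not from the paper): each raw record is tagged with provenance metadata before records from different sources are merged on a shared patient identifier.

```python
import datetime

def enrich(record: dict, source: str) -> dict:
    """Attach provenance metadata to a raw record before integration."""
    return {
        **record,
        "_meta": {
            "source": source,
            "ingested_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        },
    }

def integrate(*record_sets):
    """Merge per-source record lists keyed on patient_id; later sources
    add new fields but never silently overwrite earlier values."""
    merged = {}
    for records in record_sets:
        for rec in records:
            pid = rec["patient_id"]
            entry = merged.setdefault(pid, {"patient_id": pid, "sources": []})
            entry["sources"].append(rec["_meta"]["source"])
            for key, value in rec.items():
                if key in ("patient_id", "_meta"):
                    continue
                entry.setdefault(key, value)  # keep first-seen value
    return merged

hospital = [enrich({"patient_id": "p01", "diagnosis": "I10"}, "hospital_ehr")]
lab = [enrich({"patient_id": "p01", "hba1c": 6.4}, "lab_system")]
print(integrate(hospital, lab))
```

Keeping the per-source provenance list alongside the merged fields is what makes the enriched record more useful than any single isolated source: downstream analyses can trace each value back to the system it came from.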