2021
DOI: 10.1016/j.ejmp.2021.04.004

Enterprise imaging and big data: A review from a medical physics perspective

Abstract: In recent years, enterprise imaging (EI) solutions have become a core component of healthcare initiatives, while a simultaneous rise in big data has opened up a number of possibilities in how we can analyze and derive insights from large amounts of medical data. Together they afford us a range of opportunities that can transform healthcare in many fields. This paper provides a review of recent developments in EI and big data in the context of medical physics. It summarizes the key aspects of EI and big data in …

Cited by 8 publications (12 citation statements).
References 89 publications (120 reference statements).
“…Various techniques can be employed to address the lack of data, such as data augmentation [34,37]. Additional realistic data could be generated through GANs using the latent data distributions [13,34,37]. LRP pruning was shown to provide good results for transfer learning with small datasets [56].…”
Section: Data-centric XAI (citation type: mentioning)
confidence: 99%
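The excerpt above cites data augmentation as one way to compensate for small training sets. A minimal sketch of that idea is given below, using only NumPy; the array shapes, transformations, and noise level are illustrative assumptions and are not taken from the reviewed papers.

```python
# Minimal data-augmentation sketch for a small grayscale image dataset.
# Assumes images are 2-D arrays with intensities scaled to [0, 1].
import numpy as np

rng = np.random.default_rng(seed=0)

def augment(image: np.ndarray) -> np.ndarray:
    """Return a randomly perturbed copy of a 2-D grayscale image."""
    out = image.copy()
    if rng.random() < 0.5:                         # random horizontal flip
        out = np.fliplr(out)
    out = np.rot90(out, k=rng.integers(0, 4))      # random 90-degree rotation
    out = out + rng.normal(0.0, 0.01, out.shape)   # mild additive Gaussian noise
    return np.clip(out, 0.0, 1.0)

# Usage: expand a toy dataset of 8 synthetic 64x64 "images" four-fold.
images = rng.random((8, 64, 64))
augmented = np.stack([augment(img) for img in images for _ in range(4)])
print(augmented.shape)  # (32, 64, 64)
```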
“…The DNN model or the algorithm architecture also depends on hyperparameters that define the performance of the trained model [12]. Neural networks require a lot of training data and time before they can produce acceptable results, but act like a 'black box', providing little or no insight into the decision-making process [13]. The model opaqueness makes it hard even for the model developers to understand how the model is functioning [14].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
“…DSS systems often act as "black boxes", thus preventing the necessary confidence in their use from being built. Further research in the field of explainable AI (XAI) is mandatory to reconcile the requirement for DSS systems with high performance, which are generally complex and barely interpretable, with the need to provide responses that can be explained to patients [18,36,37,38,39].…”
Section: Current Limitations to the Use of AI-based Solutions in Clinical Workflows (citation type: mentioning)
confidence: 99%
“…Additionally, Medical Physicists could act as facilitators in the current research challenge of making AI models explainable, which is a fundamental requirement for making them acceptable as clinical support tools [36]. They are naturally the intermediary professionals between clinicians and technology, so they can help set up the appropriate dictionaries to translate machine information into human (clinical) language and vice versa, thus globally supporting XAI research.…”
Section: The Role of Medical Physicists (citation type: mentioning)
confidence: 99%
“…Last but not least, the review of McCarthy et al. [12] provides insights into recent enterprise imaging solutions applied to medical physics and healthcare settings. The rise in big data has opened up numerous opportunities for applying enterprise imaging solutions to big-data issues in healthcare.…”
Section: The Pillars of AI Knowledge for MPs (citation type: mentioning)
confidence: 99%