2021
DOI: 10.2196/21810
Machine Learning–Based Analysis of Encrypted Medical Data in the Cloud: Qualitative Study of Expert Stakeholders’ Perspectives

Abstract: Background Third-party cloud-based data analysis applications are proliferating in electronic health (eHealth) because of the expertise offered and their monetary advantage. However, privacy and security are critical concerns when handling sensitive medical data in the cloud. Technical advances based on “crypto magic” in privacy-preserving machine learning (ML) enable data analysis in encrypted form for maintaining confidentiality. Such privacy-enhancing technologies (PETs) could be counterintuitive…
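The abstract's reference to analysis of data "in encrypted form" can be made concrete with a minimal sketch. The example below is not from the paper: it assumes the open-source python-paillier package ("phe"), an additively homomorphic scheme that is far more limited than the fully homomorphic or functional encryption the study's stakeholders discuss, and the patient readings and variable names are hypothetical. It only illustrates the core idea that a cloud party can compute on ciphertexts without seeing plaintext.

# Minimal sketch, assuming the python-paillier ("phe") package is installed;
# the library choice and the readings below are illustrative, not from the paper.
from phe import paillier

# The data owner generates a keypair and keeps the private key on-premises.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical patient readings are encrypted before upload to the cloud.
readings = [120, 135, 128]
ciphertexts = [public_key.encrypt(r) for r in readings]

# The cloud operates on ciphertexts only: encrypted values can be added together
# and multiplied by plaintext scalars without any decryption.
encrypted_sum = ciphertexts[0] + ciphertexts[1] + ciphertexts[2]
encrypted_mean = encrypted_sum * (1.0 / len(readings))

# Only the holder of the private key can recover the result (about 127.67 here).
print(private_key.decrypt(encrypted_mean))

Paillier supports only addition and scalar multiplication; the point of the sketch is simply that meaningful computation on encrypted medical data is possible at all, which is the property the paper's interviewees found counterintuitive.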


Cited by 17 publications (9 citation statements)
References 49 publications
“…After full-text screening, 90 (73.8%) records were excluded primarily due to the absence of concepts of trust or acceptance and a lack of factors related to trust or acceptance in the main text or data-containing figures. As a result, 32 (26.2%) papers and reports [ 7 , 9 - 12 , 15 , 16 , 19 , 21 , 31 - 53 ] were included in the data analysis.…”
Section: Results (mentioning)
confidence: 99%
“…To address these trust barriers, the literature discussed the importance of keeping AI systems updated by introducing new rules and cases along with routine performance assessments to enhance the accuracy of decisions made by AI-based medical devices. 73,78 Further regulations and legislation could also increase trust by ensuring the balance between innovation and patient safety and confirming that AI algorithms meet appropriate standards of clinical benefit. 79,80…”
Section: Results (mentioning)
confidence: 99%
“…This is especially the case if PETs, such as FE and homomorphic encryption, are novel for users and different from traditional (encryption) schemes that the user may have heard of or are familiar with. Furthermore, they could be perceived as counter-intuitive, as no existing real-world analogies exist [5]. Therefore, research on usable explanations of FE (and of PETs in general), which can contribute to usable privacy notices as part of consent forms, requires further work.…”
Section: Introduction (mentioning)
confidence: 99%