2021
DOI: 10.3390/app11167360
Privacy Preserving Classification of EEG Data Using Machine Learning and Homomorphic Encryption

Abstract: Data privacy is a major concern when accessing and processing sensitive medical data. A promising approach among privacy-preserving techniques is homomorphic encryption (HE), which allows computations to be performed on encrypted data. Currently, HE still faces practical limitations related to high computational complexity, noise accumulation, and applicability solely at the level of bits or small integer values. We propose herein an encoding method that enables typical HE schemes to operate on real-valued numbers […]

Cited by 24 publications (7 citation statements)
References 28 publications
“…Popescu et al [73] developed a privacy-preserving method for classifying EEG data using homomorphic encryption (HE) [74] and machine learning. This method introduces an encoding system that adapts typical HE schemes to handle real-valued numbers efficiently, tackling the high computational demand and noise accumulation issues inherent in HE applications.…”
Section: Utilizing Large-Scale EEG: Overcoming Privacy Challenges in …
confidence: 99%
“…This limits the set and number of transformations applicable to the data and requires the use of approximations for more complex operations (e.g., HE-ReLU is the polynomial approximation of the ReLU function (Yue et al, 2021b)). This also significantly increases the computational time needed to process encrypted text compared to plaintext by several orders of magnitude (Popescu et al, 2021).…”
Section: Homomorphic Encryption
confidence: 99%
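The HE-ReLU mentioned in the statement above replaces a non-polynomial activation with additions and multiplications only, which are the operations HE ciphertexts support. A minimal sketch, assuming a degree-2 least-squares fit on [-1, 1] (the cited work's actual polynomial and interval may differ):

```python
import numpy as np

# Fit a degree-2 polynomial to ReLU on [-1, 1] by least squares.
xs = np.linspace(-1.0, 1.0, 1001)
relu = np.maximum(xs, 0.0)
coeffs = np.polyfit(xs, relu, deg=2)  # [c2, c1, c0]

def he_relu(x):
    # Uses only + and *, so it maps directly onto HE ciphertext ops.
    c2, c1, c0 = coeffs
    return c2 * x * x + c1 * x + c0

max_err = np.max(np.abs(he_relu(xs) - relu))
print(f"max abs error on [-1, 1]: {max_err:.3f}")
```

Evaluating such a polynomial on a ciphertext costs multiplicative depth, which is why the quoted statement notes the orders-of-magnitude slowdown versus plaintext.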
“…However, the application of AI techniques can cause privacy leakage and security risks since solutions based on AI are reliant on a large number of training samples commonly belonging to different users. To make effective and safe use of data distributed in different locations while satisfying privacy requirements, some approaches have incorporated privacy-preserving techniques into image processing [5][6][7][8][9]14,33]. For example, Fagbohungbe et al [5] proposed a secure intelligent computing framework for image classification based on deep learning and edge computing.…”
Section: Related Work
confidence: 99%
“…Existing approaches to train machine learning models in privacy-preserving settings mainly rely on secure multi-party computation [11], fully or partially homomorphic encryption [12][13][14][15] and secret sharing [16,17]. Most of these approaches have several limitations: (i) First, in traditional privacy-preserving machine learning schemes, the non-linear functions are hard to support using common cryptographic techniques, such as homomorphic encryption and boolean circuit.…”
Section: Introduction
confidence: 99%
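Of the primitives listed in the statement above, secret sharing is the simplest to sketch. The following is an illustrative additive scheme over a prime field, with parameter choices (modulus, party count) assumed for the example rather than taken from any cited protocol:

```python
import secrets

P = 2**61 - 1  # Mersenne prime used as the field modulus (illustrative)

def share(secret: int, n_parties: int = 3):
    """Split `secret` into n additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

sh = share(42)
assert reconstruct(sh) == 42

# Addition of two secrets works share-wise: each party adds its own
# shares locally, and no single party learns either input.
sh2 = share(7)
added = [(a + b) % P for a, b in zip(sh, sh2)]
print(reconstruct(added))  # 49
```

As the quoted limitation notes, linear operations like this are cheap, while non-linear functions require heavier machinery (garbled circuits, polynomial approximation, or interaction rounds).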