Anonymization as homeomorphic data space transformation for privacy-preserving deep learning
2021 · DOI: 10.1016/j.procs.2021.01.337

Cited by 9 publications (3 citation statements) · References 19 publications
“…As mentioned above, a promising solution for privacy preservation is homomorphic encryption, but there are also other techniques for data encryption and/or anonymization. For example, on the basis of homomorphic data space transformation, Girka et al. [38] proposed an anonymization algorithm to protect data while still allowing neural network training. They analyzed the effects that this method has on the neural network performance.…”
Section: Review of Privacy-Preserving AI in Industrial Applications
confidence: 99%
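The idea summarized in the statement above, anonymizing data with an invertible (and therefore homeomorphic) transformation of the feature space so that a neural network can still be trained on the transformed samples, can be illustrated with a minimal sketch. This is not the algorithm of Girka et al. [38]; the random linear map, the toy dataset, and the use of scikit-learn's MLPClassifier are all illustrative assumptions.

```python
# Minimal sketch (not the authors' algorithm from [38]): anonymize feature
# vectors with a secret, invertible linear map before training a network
# on the transformed data only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def make_secret_transform(dim):
    """Draw a random invertible matrix; kept secret by the data owner."""
    while True:
        A = rng.normal(size=(dim, dim))
        if abs(np.linalg.det(A)) > 1e-6:  # reject (rare) near-singular draws
            return A

def anonymize(X, A):
    """Map raw features into the transformed (anonymized) space."""
    return X @ A.T

# Toy data: 200 samples, 4 features, binary labels.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

A = make_secret_transform(X.shape[1])
X_anon = anonymize(X, A)  # only the transformed data leaves the owner

# A neural network can still be fit on the anonymized representation.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_anon, y)
print("train accuracy on anonymized data:", clf.score(X_anon, y))
```

Because the transformation is a bijection of the feature space, class structure is preserved for the learner, while an outside party who sees only the transformed vectors cannot read off the original attribute values without the secret map; the effect of such transformations on model performance is exactly what the cited work analyzes.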
“…In [13], a new algorithm was suggested for protecting sensitive data (against adversarial attacks) used for training and testing intelligent systems based on deep neural networks. It can be used as an important feature of the digital immune system and as a complementary alternative to a digital vaccination concept.…”
Section: Immune Project Puzzles
confidence: 99%
“…Privacy-preserving measures aim to prevent the leakage of sensitive information by making it harder to distinguish from non-sensitive information. However, they do not rule out the possibility of discovering inference rules [10], [11]. Therefore, in recent years, experts have concentrated their attention on association rules that protect personal information.…”
Section: Introduction
confidence: 99%