2022
DOI: 10.1186/s12864-022-08469-w

Secure tumor classification by shallow neural network using homomorphic encryption

Abstract (Background): Disclosure of patients’ genetic information in the process of applying machine learning techniques for tumor classification hinders the privacy of personal information. Homomorphic Encryption (HE), which supports operations between encrypted data, can be used as one of the tools to perform such computation without information leakage, but it brings great challenges for directly applying general machine learning algorithms due to the limitations of operations supported by HE. In parti…
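The abstract's point about "the limitations of operations supported by HE" can be made concrete: a leveled HE scheme such as CKKS evaluates only additions and multiplications on ciphertexts, so standard nonlinearities have to be replaced by polynomial-friendly ones. The sketch below is a plaintext NumPy stand-in, not the paper's actual architecture or any HE library call; the layer sizes, the square activation, and the deferred argmax are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed layer sizes, not the paper's model): a shallow network
# whose forward pass uses only additions and multiplications, the operations a
# leveled HE scheme can apply to ciphertexts. Nonlinearities that HE cannot
# evaluate directly (ReLU, softmax) are replaced by a square activation; the
# final class decision is left to the data owner after decryption.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 8)), np.zeros(8)   # hypothetical hidden layer
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)    # hypothetical output layer

def he_friendly_forward(x):
    h = x @ W1 + b1      # linear layer: additions and multiplications only
    h = h * h            # square activation: costs one level of multiplicative depth
    return h @ W2 + b2   # class scores, still a polynomial in the input

scores = he_friendly_forward(rng.normal(size=64))
print(scores.argmax())   # argmax taken on decrypted scores
```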

Cited by 21 publications (16 citation statements)
References 41 publications

Citation statements
“…It shows how the minimum and maximum values of input of softmax vary as the training proceeds. The value increases in the order of two, which cannot be handled by previous approximation methods (Lee et al., 2022b; Hong et al., 2022; Jin et al., 2020). This shows that, with the previous approximation methods, it is hard to train a model as much as we want.…”
Section: Softmax Approximation (mentioning)
confidence: 97%
“…There are several works on the approximation of softmax function with polynomials (Lee et al., 2022b; Hong et al., 2022; Jin et al., 2020). However, all of these methods have low precision (See the Table 3 in the Appendix) and permit only a small domain of approximation, which is not desirable for training with many epochs.…”
Section: Softmax Approximation (mentioning)
confidence: 99%
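The "small domain of approximation" the excerpt points at can be illustrated with a simple stand-in experiment, not the method of the cited works: fit a low-degree polynomial to exp, the core of softmax(x)_i = exp(x_i) / sum_j exp(x_j), on a bounded interval and watch the fit break once inputs leave that interval, as they do when training runs long enough. The interval [-8, 8] and the degree below are assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative stand-in (not the cited approximation methods): approximate exp
# by a degree-15 least-squares Chebyshev fit on the assumed interval [-8, 8],
# then compare its error inside the fitted range and on a wider range that the
# softmax inputs may grow into during training.
lo, hi, deg = -8.0, 8.0, 15
xs = np.linspace(lo, hi, 2001)
p = np.polynomial.Chebyshev.fit(xs, np.exp(xs), deg)  # evaluable with +/* only

inside = np.linspace(lo, hi, 101)          # inputs within the approximation domain
wider  = np.linspace(2 * lo, 2 * hi, 101)  # inputs that have drifted past it

print(np.max(np.abs(p(inside) - np.exp(inside))))  # small residual on the fitted range
print(np.max(np.abs(p(wider) - np.exp(wider))))    # error explodes outside the range
```

Under HE, the degree of such a polynomial translates into multiplicative depth, so enlarging the approximation domain by raising the degree is costly; that trade-off is the tension the quoted passage describes.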