2021
DOI: 10.1155/2021/5542283
Differential Privacy Principal Component Analysis for Support Vector Machines

Abstract: In the big data era, massive, high-dimensional data are produced constantly, increasing the difficulty of both analyzing and protecting them. In this paper, to achieve dimensionality reduction and privacy protection simultaneously, principal component analysis (PCA) and differential privacy (DP) are combined to handle these data. A support vector machine (SVM) is then used to measure the utility of the processed data. Specifically, we introduce differential privacy mechanisms at different stages…
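The abstract describes a pipeline of DP-perturbed PCA followed by SVM training to measure utility. A minimal sketch of one such pipeline is below; the noise placement (symmetric Laplace noise on the covariance matrix) and the noise scale are illustrative assumptions, not the paper's exact algorithm or calibrated sensitivity:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

def dp_pca(X, k, epsilon, rng):
    """Project X onto k principal components of a Laplace-perturbed
    covariance matrix (one common DP-PCA recipe; illustrative only)."""
    Xc = X - X.mean(axis=0)
    n, d = Xc.shape
    cov = Xc.T @ Xc / n
    # Symmetric Laplace noise; the scale here is an illustrative choice,
    # not a sensitivity calibration from the paper.
    noise = rng.laplace(scale=1.0 / (n * epsilon), size=(d, d))
    cov_noisy = cov + (noise + noise.T) / 2
    vals, vecs = np.linalg.eigh(cov_noisy)      # ascending eigenvalues
    top = vecs[:, np.argsort(vals)[::-1][:k]]   # top-k eigenvectors
    return Xc @ top

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
Z = dp_pca(X, k=2, epsilon=1.0, rng=rng)        # privatized, reduced data
Ztr, Zte, ytr, yte = train_test_split(Z, y, random_state=0)
acc = SVC().fit(Ztr, ytr).score(Zte, yte)       # SVM accuracy as a utility proxy
```

SVM test accuracy on the reduced data serves as the utility measure: the closer it is to the non-private baseline, the less the privacy mechanism has degraded the data.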

Cited by 2 publications (4 citation statements) · References 15 publications
“…The three algorithms, DPSVD, AG [9], and DPPCA-SVM [26], were compared theoretically, as summarized in Table 2. The other algorithms have been compared by the DPPCA-SVM algorithm.…”
Section: Methods
confidence: 99%
See 2 more Smart Citations
“…The three algorithms were compared theoretically between DPSVD, AG [ 9 ], and DPPCA-SVM [ 26 ] summarized in Table 2 . Other ones have been compared by the DPPCA-SVM algorithm.…”
Section: Methodsmentioning
confidence: 99%
“…Imtiaz and Sarwate [22, 23] and Jiang et al. [24] perturbed the covariance matrix with Wishart noise, which guarantees that the perturbed covariance matrix remains positive semidefinite. Xu et al. [25] and Huang et al. [26] added symmetric Laplace noise to the covariance matrix. All of these methods generate a perturbed covariance matrix by adding a noise matrix and then perform eigenvalue decomposition (EVD) to implement PCA.…”
Section: Related Work
confidence: 99%
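The distinction drawn in this citation statement is that Wishart noise preserves positive semidefiniteness of the covariance matrix, while symmetric Laplace noise need not. A small sketch of the Laplace-perturbation step (matrix sizes and noise scale are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
A = rng.normal(size=(d, d))
cov = A @ A.T                       # a positive semidefinite covariance matrix

# Symmetric Laplace perturbation, as in the Xu et al. / Huang et al. line of
# work; unlike Wishart noise, it can push eigenvalues below zero.
L = rng.laplace(scale=0.5, size=(d, d))
perturbed = cov + (L + L.T) / 2

min_eig_before = np.linalg.eigvalsh(cov).min()        # >= 0 up to rounding
min_eig_after = np.linalg.eigvalsh(perturbed).min()   # may be negative
```

After the perturbation, EVD of `perturbed` yields the principal directions used for the DP projection; if downstream steps require a valid covariance matrix, negative eigenvalues would have to be handled (e.g. clipped to zero).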
“…In DP, the sensitivity Δf can be understood as the approximate adjacency degree of the two data sets for the function f. Definition 3 (Laplace Mechanism [27]). Given a data set D and a function f: D → R^d with sensitivity Δf, the mechanism M(D) = f(D) + (Lap(Δf/ε))^d provides (ε, 0)-DP.…”
Section: Differential Privacy
confidence: 99%