2019
DOI: 10.1002/cpe.5208
PPEM: Privacy‐preserving EM learning for mixture models

Abstract: Privacy is becoming increasingly important in collaborative data analysis, especially in analyses involving personal or sensitive information, as commonly arises in health and commercial settings. The aim of privacy-preserving statistical algorithms is to allow inference to be drawn on the joint data without disclosing the private data held by each party. This paper presents a privacy-preserving expectation–maximization (PPEM) algorithm for carrying out maximum likelihood estimation of the parameters of mixture m…
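To make the setting concrete, the following is a minimal sketch of EM for a two-component one-dimensional Gaussian mixture over horizontally partitioned data, where each party contributes only its local sufficient statistics (counts and weighted sums) rather than raw observations. This mirrors the distributed structure PPEM targets, but all names, initial values, and the aggregation step are illustrative assumptions, not the paper's actual protocol (which additionally secures the aggregation itself).

```python
import math
import random

def em_gmm_1d(parties, iters=50):
    """Toy EM for a 2-component 1-D Gaussian mixture.

    `parties` is a list of lists of scalars (one list per party).
    Only per-party sufficient statistics are pooled; the raw points
    never leave the inner loop. Hypothetical sketch, not PPEM itself.
    """
    # Assumed starting parameters: mixing weights, means, variances.
    pi, mu, var = [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]

    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: each party accumulates local sufficient statistics.
        N = [0.0, 0.0]   # responsibility mass per component
        S = [0.0, 0.0]   # responsibility-weighted sum of x
        Q = [0.0, 0.0]   # responsibility-weighted sum of x^2
        for data in parties:
            for x in data:
                w = [pi[k] * pdf(x, mu[k], var[k]) for k in range(2)]
                tot = sum(w)
                for k in range(2):
                    r = w[k] / tot
                    N[k] += r
                    S[k] += r * x
                    Q[k] += r * x * x
        # M-step: parameters are recomputed from the aggregates alone.
        n = sum(N)
        for k in range(2):
            pi[k] = N[k] / n
            mu[k] = S[k] / N[k]
            var[k] = max(Q[k] / N[k] - mu[k] ** 2, 1e-6)
    return pi, mu, var

# Two parties, each holding one well-separated cluster.
random.seed(0)
party_a = [random.gauss(-2, 0.5) for _ in range(200)]
party_b = [random.gauss(2, 0.5) for _ in range(200)]
pi, mu, var = em_gmm_1d([party_a, party_b])
```

In a privacy-preserving variant, the pooling of `N`, `S`, and `Q` would be replaced by a secure summation so that no party learns another's local statistics.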

Cited by 5 publications (7 citation statements)
References 43 publications
“…We decided to choose the latter approach, as we deemed our search criteria to be broad enough to cover follow-up studies. We found several follow-up papers that extend their existing methods to: 1) solve other data-partitioning problems [125,126]; 2) apply to more advanced data-analysis algorithms [62,64]; 3) cover more complicated user scenarios [67,68]; 4) conduct more experiments using real-life datasets [25,59,96,109].…”
Section: Potential Limitations
Mentioning confidence: 99%
“…Reference [39] introduces a circular transition protocol for secure summation, which is secure only with an honest majority. Utilizing additive homomorphic encryption schemes, references [40]-[42] each propose a cryptography-based secure summation protocol secure against a malicious majority. In [39], [40], [42], a master site is randomly selected to obtain the results of the secure summations for A_i, B_i, C_i, D_i, n, update the parameters according to Eqs. (12), (13), and (15), and distribute them to all other sites.…”
Section: Preliminaries: A Graph-Based Semi-Supervised Learning A…
Mentioning confidence: 99%
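The circular secure-summation idea attributed to reference [39] can be sketched as follows: an initiating site adds a random mask to its value, the running total is passed around the ring with each site adding its own contribution modulo a large modulus, and the initiator removes the mask at the end. Function and variable names are hypothetical; as the snippet notes, this construction is only safe against an honest majority (the two ring neighbours of a site can collude to recover its input).

```python
import random

def circular_secure_sum(local_values, modulus=2**32):
    """Ring-based secure summation sketch (honest-majority only).

    `local_values` stands in for one private non-negative integer per
    site; in a real deployment each site would add its value as the
    token passes through, never revealing it directly.
    """
    # The initiator's random mask hides the partial sums from every site.
    mask = random.randrange(modulus)
    running = mask
    for v in local_values:
        # Each site adds its private value to the masked running total.
        running = (running + v) % modulus
    # Only the initiator knows the mask, so only it learns the true sum.
    return (running - mask) % modulus

total = circular_secure_sum([3, 5, 7])  # 15
```

The modulus must exceed the largest possible true sum, otherwise the result wraps around; homomorphic-encryption-based variants avoid the collusion weakness at higher computational cost.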
“…Utilizing additive homomorphic encryption schemes, references [40]-[42] each propose a cryptography-based secure summation protocol secure against a malicious majority. In [39], [40], [42], a master site is randomly selected to obtain the results of the secure summations for A_i, B_i, C_i, D_i, n, update the parameters according to Eqs. (12), (13), and (15), and distribute them to all other sites. Reference [41] proposed a protocol where every site initiates a secure summation as a sponsor and receives…”
Section: Preliminaries: A Graph-Based Semi-Supervised Learning A…
Mentioning confidence: 99%
“…There is a substantial amount of work dealing with privacy in machine-learning methods; see for instance [7]-[14]. Of particular interest to our work are [15]-[21], which also consider privacy of distributed EM. Among them, the privacy of the proposed solution in [21] is based on two-party cryptographic computations that tend to be computationally heavy and time consuming.…”
Section: Introduction
Mentioning confidence: 99%