2017
DOI: 10.1515/popets-2017-0053

Privacy-Preserving Distributed Linear Regression on High-Dimensional Data

Abstract: We propose privacy-preserving protocols for computing linear regression models, in the setting where the training dataset is vertically distributed among several parties. Our main contribution is a hybrid multi-party computation protocol that combines Yao's garbled circuits with tailored protocols for computing inner products. Like many machine learning tasks, building a linear regression model involves solving a system of linear equations. We conduct a comprehensive evaluation and comparison of diffe…
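The reduction the abstract mentions can be made concrete: on plaintext data, least-squares training amounts to forming and solving the normal equations. Below is a minimal NumPy sketch of that computation in the vertically partitioned setting; the sizes and variable names are illustrative, not taken from the paper, and the secure protocol never assembles the joint matrix in the clear.

```python
import numpy as np

# Plaintext view of what the secure protocol computes: linear regression
# reduces to solving the normal equations (X^T X) theta = X^T y.
rng = np.random.default_rng(0)
n, d_a, d_b = 100, 3, 2          # illustrative sizes, not from the paper

X_a = rng.normal(size=(n, d_a))  # party A's feature columns
X_b = rng.normal(size=(n, d_b))  # party B's feature columns
y = rng.normal(size=n)           # labels, held by one of the parties

# The joint design matrix; in the secure protocol it is never
# materialized in the clear.
X = np.hstack([X_a, X_b])

# Every entry of X^T X that crosses the partition is an inner product
# between a column of X_a and a column of X_b -- the quantity the
# paper's tailored inner-product subprotocols compute on private data.
A = X.T @ X
b = X.T @ y

# Solving this linear system is the step performed inside a garbled
# circuit on secret-shared A and b.
theta = np.linalg.solve(A, b)
print(theta)
```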

Cited by 168 publications (126 citation statements)
References 53 publications
“…Privacy-Preserving Machine Learning. Another relevant line of work is privacy-preserving machine learning [24], [19], [11], [10], [14], [28], [5], [6], [1]. Mohassel and Zhang [28] present efficient protocols for training linear regression, logistic regression, and neural networks in a privacy-preserving manner.…”
Section: Related Work
confidence: 99%
“…In order to execute queries and compute statistics on distributed datasets, multiple decentralized solutions [10], [12], [14], [22], [23], [24], [25] rely on techniques that have a high expressive power, such as secret sharing and garbled circuits. These solutions are often flexible in the computations they offer but usually assume (a) honest-but-curious computing parties and (b) no collusion or a 2-party model.…”
Section: Related Work
confidence: 99%
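To illustrate the secret-sharing technique this statement refers to, here is a minimal two-party sketch of computing an inner product on additive shares using Beaver multiplication triples. The modulus, the trusted dealer, and the helper names are illustrative assumptions, not taken from the paper or the cited works; real protocols generate the triples via oblivious transfer or homomorphic encryption rather than a dealer.

```python
import secrets

P = 2**61 - 1  # illustrative prime modulus (assumption, not from the paper)

def share(v):
    """Additively secret-share v mod P between two parties."""
    r = secrets.randbelow(P)
    return [r, (v - r) % P]

def reconstruct(sh):
    return sum(sh) % P

def beaver_triple():
    """Shares of correlated randomness (a, b, c = a*b). A trusted dealer
    stands in for the OT/HE preprocessing used in real protocols."""
    a, b = secrets.randbelow(P), secrets.randbelow(P)
    return share(a), share(b), share(a * b % P)

def shared_mul(xs, ys):
    """Multiply two secret-shared values using one Beaver triple."""
    a, b, c = beaver_triple()
    # The parties open only the masked values d = x - a and e = y - b.
    d = reconstruct([(xs[i] - a[i]) % P for i in range(2)])
    e = reconstruct([(ys[i] - b[i]) % P for i in range(2)])
    # x*y = c + d*b + e*a + d*e; the public d*e term is added by party 0.
    return [(c[i] + d * b[i] + e * a[i] + (i == 0) * d * e) % P
            for i in range(2)]

# Party A holds x, party B holds y; the inner product is accumulated on
# shares, so neither party ever sees the other's vector.
x, y = [3, 1, 4], [2, 7, 1]
acc = [0, 0]
for xi, yi in zip(x, y):
    zi = shared_mul(share(xi), share(yi))
    acc = [(acc[j] + zi[j]) % P for j in range(2)]

assert reconstruct(acc) == sum(a * b for a, b in zip(x, y)) % P
```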
“…For the large scale data sets needed for most machine learning applications, it is not feasible to perform training across private data sets as a generic many-party computation. Instead, hybrid approaches have been designed that combine MPC with homomorphic encryption (Nikolaenko et al., 2013b; Gascón et al., 2017) or develop custom protocols to perform secure arithmetic operations efficiently (Mohassel and Zhang, 2017). These approaches can scale to data sets containing many millions of elements.…”
Section: Introduction
confidence: 99%
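As a sketch of the homomorphic-encryption half of such a hybrid, the snippet below computes a two-party inner product with the python-paillier (phe) library: one party encrypts its vector, the other computes on ciphertexts, and only the aggregate is decrypted. This illustrates the general technique under the stated assumptions, not the exact protocols of the cited papers.

```python
from phe import paillier  # pip install phe (python-paillier)

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

x = [3, 1, 4]  # party A's private vector
y = [2, 7, 1]  # party B's private vector

# Party A sends only ciphertexts to party B.
enc_x = [public_key.encrypt(v) for v in x]

# Paillier is additively homomorphic: B can add ciphertexts and scale
# them by plaintext values, which is exactly an inner product with y.
enc_ip = enc_x[0] * y[0]
for cx, yi in zip(enc_x[1:], y[1:]):
    enc_ip = enc_ip + cx * yi

# Only the aggregate result is ever decrypted, by party A.
assert private_key.decrypt(enc_ip) == sum(a * b for a, b in zip(x, y))
```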