2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849347

Private Inner Product Retrieval for Distributed Machine Learning

Abstract: In this paper, we argue that in many basic algorithms for machine learning, including support vector machine (SVM) for classification, principal component analysis (PCA) for dimensionality reduction, and regression for dependency estimation, we need the inner products of the data samples rather than the data samples themselves. Motivated by the above observation, we introduce the problem of private inner product retrieval for distributed machine learning, where we have a system including a database of some files …
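
The abstract's central observation is that the listed algorithms depend on the data only through pairwise inner products. As a minimal illustration (not taken from the paper; NumPy and the random data below are assumptions), this sketch recovers PCA projection scores from the Gram matrix of inner products alone, without touching the raw feature vectors:

```python
# Minimal sketch (hypothetical data): PCA scores can be computed from the
# pairwise inner products (Gram matrix) alone, without the raw samples.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))      # 50 samples, 8 features (assumed data)
X -= X.mean(axis=0)                   # center the data

# (a) Classical PCA from the raw samples: project onto top-k right singular vectors.
k = 3
U, S, Vt = np.linalg.svd(X, full_matrices=False)
proj_from_samples = X @ Vt[:k].T      # n x k component scores

# (b) The same scores from the Gram matrix K = X X^T (inner products only).
K = X @ X.T
eigvals, eigvecs = np.linalg.eigh(K)  # eigenvalues in ascending order
idx = np.argsort(eigvals)[::-1][:k]   # top-k eigenpairs
proj_from_gram = eigvecs[:, idx] * np.sqrt(eigvals[idx])

# The two sets of scores agree up to a sign flip per component.
for j in range(k):
    a, b = proj_from_samples[:, j], proj_from_gram[:, j]
    assert np.allclose(a, b) or np.allclose(a, -b)
print("PCA scores recovered from inner products only.")
```

A similar reduction holds for the dual form of the SVM and for regression via the kernel trick, which is why retrieving inner products privately can stand in for retrieving the samples themselves.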

Cited by 13 publications (9 citation statements).
References 16 publications (23 reference statements).
“…1) Construction of the matrix G: The user constructs an L(n + m) × K matrix G with a structure similar to (10), where G_1, …”
Section: B, Case (ii) (mentioning, confidence: 99%)
See 4 more Smart Citations
“…, 8 of V correspond respectively to the message indices in W, i.e., 5, 8, 11, 2, 4, 7, 10, 18. Thus, the user constructs the permutation π such that π(5) = 9, π(8) = 10, π(11) = 11, π(2) = 12, π(4) = 13, π(7) = 14, π(10) = 15, π(18) = 16. …”
Section: Appendix, Illustrative Examples of the GPC-PIA Protocol (mentioning, confidence: 99%)
See 3 more Smart Citations
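
The second quoted excerpt describes a simple bookkeeping step: the demanded message indices in W are sent by a permutation π to the consecutive positions 9 through 16. The sketch below is a hypothetical reconstruction of just that step (the rest of the cited GPC-PIA protocol is omitted, and the helper name build_permutation is an assumption); it reproduces exactly the mapping listed in the quote:

```python
# Sketch of the permutation bookkeeping quoted above: the indices of the demanded
# messages in W are mapped to consecutive target positions (9..16 in the example).
# Everything outside this single mapping step is assumed or omitted.
def build_permutation(message_indices, first_target):
    """Map each message index to a consecutive position starting at first_target."""
    return {idx: first_target + offset for offset, idx in enumerate(message_indices)}

W = [5, 8, 11, 2, 4, 7, 10, 18]   # message indices from the quoted example
pi = build_permutation(W, first_target=9)

assert pi == {5: 9, 8: 10, 11: 11, 2: 12, 4: 13, 7: 14, 10: 15, 18: 16}
print(pi)
```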