2019
DOI: 10.48550/arxiv.1902.00641
Preprint

CodedPrivateML: A Fast and Privacy-Preserving Framework for Distributed Machine Learning


Cited by 25 publications (19 citation statements)
References 0 publications

“…Canoe contributes to such approaches by proposing generic system support that makes them applicable to scenarios based on neural network models. Other types of distributed machine learning systems introduce complementary techniques aimed at model specialization [69,85,88], including when targeting edge nodes and IoT, as well as for addressing privacy concerns about data and model sharing [42,59,77]. Knowledge Transfer Technique.…”
Section: Related Work
confidence: 99%
“…In Step 1, given any upper bound construction of R(p, m, n) (e.g., Strassen's construction) with rank R and tensor tuples a ∈ F^(R×p×m), b ∈ F^(R×p×n), and c ∈ F^(R×m×n), we pre-encode the inputs each into a list of R coded submatrices Ã_{i,vec}…”
Section: Achievability Schemes for Secure Distributed Matrix Multiplication
confidence: 99%
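
The excerpt above describes the pre-encode/distribute/decode pattern behind secure distributed matrix multiplication. Below is a minimal Python sketch of that pattern using a simple Shamir-style polynomial masking over a small prime field, rather than the Strassen-rank tensor construction the excerpt refers to; the prime, evaluation points, and function names are illustrative assumptions.

```python
import numpy as np

P = 65_537  # small Fermat prime; all arithmetic is over the field F_P (assumption)

def encode(M, xs):
    """Shamir-style shares of matrix M: f(x) = M + R*x with a uniform random mask R."""
    R = np.random.randint(0, P, size=M.shape, dtype=np.int64)
    return [(M + R * x) % P for x in xs]

def lagrange_at_zero(ys, xs):
    """Interpolate the polynomial through (xs, ys) and evaluate it at x = 0."""
    total = np.zeros_like(ys[0])
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        num, den = 1, 1
        for j, xj in enumerate(xs):
            if j != i:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * (num * pow(den, -1, P) % P)) % P
    return total

# Toy run: each "worker" multiplies only its masked shares, so no single
# (non-colluding) worker sees A or B in the clear. Three evaluation points
# recover the degree-2 product polynomial h(x) = f(x) g(x), and h(0) = A @ B.
A = np.random.randint(0, 100, (4, 4)).astype(np.int64)
B = np.random.randint(0, 100, (4, 4)).astype(np.int64)
xs = [1, 2, 3]
shares_A, shares_B = encode(A, xs), encode(B, xs)
ys = [(sa @ sb) % P for sa, sb in zip(shares_A, shares_B)]  # per-worker step
assert np.array_equal(lagrange_at_zero(ys, xs), (A @ B) % P)
```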
“…Large-scale distributed computing faces several modern challenges, in particular to provide resiliency against stragglers, robustness against computing errors, security against Byzantine and eavesdropping adversaries, and privacy of sensitive information, and to efficiently handle repetitive computation [1]–[7]. Coded computing is an emerging field that resolves these issues by introducing and developing new coding-theoretic concepts; it started with a focus on straggler mitigation [8]–[10] and was later extended to secure and private computation [6], [7], [11]–[14].…”
Section: Introduction
confidence: 99%
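
The straggler-resilience idea in this excerpt can be illustrated with the simplest coded-computing construction: an MDS-coded matrix-vector product in which any 2 of 3 workers' results suffice. This is a hedged sketch of the general technique, not the scheme of any particular cited paper.

```python
import numpy as np

def encode_tasks(A):
    """Split A row-wise into two blocks and add one parity block: a (3, 2) MDS code."""
    A1, A2 = np.array_split(A, 2, axis=0)
    return [A1, A2, A1 + A2]  # tasks handed to workers 0, 1, and 2

def decode(results):
    """Recover A @ x from the products returned by any two of the three workers."""
    if 0 in results and 1 in results:
        return np.concatenate([results[0], results[1]])
    if 0 in results:  # worker 1 straggled: A2 @ x = parity result - A1 @ x
        return np.concatenate([results[0], results[2] - results[0]])
    return np.concatenate([results[2] - results[1], results[1]])

A, x = np.random.randn(6, 4), np.random.randn(4)
tasks = encode_tasks(A)
# Suppose worker 1 is a straggler: decoding from workers 0 and 2 still succeeds.
results = {0: tasks[0] @ x, 2: tasks[2] @ x}
assert np.allclose(decode(results), A @ x)
```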
“…Another closely related line of work is concerned with the privacy of individual samples; this is relevant when samples contain highly sensitive information that must be kept secret even from the ML model. Examples of privacy-preserving ML include works based on differential privacy, such as [22]–[24], and privacy-preserving learning, such as [25]–[30]. A major distinction between this line of work and machine unlearning is that samples do not need to be kept private in machine unlearning, but unlearning requests need to be honored.…”
Section: Related Work
confidence: 99%
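
For the contrast the excerpt draws with differential privacy, a single noisy-gradient step in the style of DP-SGD can be sketched as follows; the clipping norm and noise multiplier are illustrative values, not taken from any cited work.

```python
import numpy as np

def dp_gradient(per_example_grads, clip_norm=1.0, noise_multiplier=1.1):
    """Clip each example's gradient, average, then add calibrated Gaussian noise."""
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    mean = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip_norm / len(per_example_grads)
    return mean + np.random.normal(0.0, sigma, size=mean.shape)

grads = [np.random.randn(10) for _ in range(32)]  # toy per-example gradients
noisy_step = dp_gradient(grads)  # this would drive one SGD parameter update
```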

Coded Machine Unlearning

Aldaghri, Mahdavifar, Beirami (2020), Preprint