2019
DOI: 10.1109/access.2019.2940052
Efficient Privacy-Preserving Machine Learning for Blockchain Network

Abstract: A blockchain, as a trustworthy, secure, decentralized, and distributed network, has emerged for many applications such as banking, finance, insurance, healthcare, and business. Recently, many communities in blockchain networks want to deploy machine learning models to extract meaningful knowledge from the geographically distributed, large-scale data owned by each participant. To run a learning model without data centralization, distributed machine learning (DML) for blockchain networks has been studied. While sev…

Cited by 110 publications (56 citation statements)
References 21 publications
“…In LearningChain, they developed a differential privacy mechanism for the local gradient computation process to protect the privacy of individual data providers, and an l-nearest aggregation scheme to defend against Byzantine attacks in the global gradient aggregation process. The following year, Kim et al. [72] pointed out that the LearningChain system has several limitations, such as low computational efficiency, no support for non-deterministic function computation, and weak privacy preservation. To resolve those issues in a systematic way, they developed an improved distributed machine learning model for permissioned blockchains.…”
Section: Blockchain for Data Security
confidence: 99%
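The two mechanisms this statement describes can be sketched as follows. This is an illustrative sketch, not the exact rules of LearningChain or [72]: the Laplace calibration for the local gradient and the coordinate-wise-median distance used to pick the l nearest updates are assumptions made for the example.

```python
import math
import random
import statistics

def dp_local_gradient(grad, sensitivity=1.0, epsilon=0.5, rng=random.Random(0)):
    """Perturb a local gradient with Laplace noise before sharing it.
    Scale b = sensitivity / epsilon is the standard Laplace-mechanism calibration."""
    b = sensitivity / epsilon
    def laplace_sample():
        # Inverse-CDF sampling: -b * sign(u) * ln(1 - 2|u|), u uniform in (-0.5, 0.5)
        u = rng.random() - 0.5
        return -b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return [g + laplace_sample() for g in grad]

def l_nearest_aggregate(gradients, l):
    """Average only the l gradients closest to the coordinate-wise median,
    discarding outliers that Byzantine parties may have submitted."""
    dim = len(gradients[0])
    median = [statistics.median(g[j] for g in gradients) for j in range(dim)]
    order = sorted(range(len(gradients)), key=lambda i: math.dist(gradients[i], median))
    nearest = order[:l]
    return [sum(gradients[i][j] for i in nearest) / l for j in range(dim)]

# Three honest parties near [1, 1]; one Byzantine party reports [100, -100].
grads = [[1.0, 1.0], [0.5, 1.5], [1.5, 0.5], [100.0, -100.0]]
print(l_nearest_aggregate(grads, l=3))  # [1.0, 1.0] -- the outlier is excluded
```

The Byzantine gradient lands far from the coordinate-wise median, so it is never among the l nearest updates and cannot skew the aggregate, while the added Laplace noise keeps any individual party's data from being inferred from its shared gradient.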
“…As such, the DPPDL further ensures fairness during the data download and upload processes of the model training. In contrast, H. Kim et al. in [132] adopt a new error-based aggregation rule to prevent attacks by adversarial nodes during the aggregation process. This rule aggregates the nearest learning results using the low errors saved in the immutable ledger as a log.…”
Section: Download Global Model
confidence: 99%
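A minimal sketch of such an error-based rule, assuming each ledger entry pairs a party's update with the training error it logged (the entry layout and the "keep the k lowest-error entries" selection are illustrative assumptions, not the exact rule in [132]):

```python
def error_based_aggregate(ledger_entries, k):
    """Keep the k entries with the lowest logged errors and average their
    gradients; adversarial updates, which tend to raise the recorded error,
    are filtered out before aggregation."""
    best = sorted(ledger_entries, key=lambda e: e["error"])[:k]
    dim = len(best[0]["gradient"])
    return [sum(e["gradient"][j] for e in best) / k for j in range(dim)]

# Hypothetical ledger log: each party's update alongside the error it recorded.
ledger = [
    {"party": "A", "gradient": [0.25, -0.125], "error": 0.31},
    {"party": "B", "gradient": [0.75, -0.375], "error": 0.29},
    {"party": "C", "gradient": [9.0, 9.0], "error": 2.50},  # adversarial node
]
print(error_based_aggregate(ledger, k=2))  # [0.5, -0.25]
```

Because the errors live in the immutable ledger, an adversarial node cannot retroactively rewrite its logged error to sneak a poisoned update into the low-error set.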
“…However, each access to the training data causes information leakage of the training data and thus incurs privacy loss against the overall privacy budget. To apply DPGAN and DPSGD to the distributed/decentralized settings, we follow recent work [30], [31], [32], [33] to conduct local gradient computation and calculate privacy on a per-party basis, where each party individually applies the moments accountant [18] to keep track of the spent privacy budget. Each party repeats the local training process until the allocated privacy budget is used up.…”
Section: End If
confidence: 99%
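The per-party budget tracking described above can be sketched as a simple accountant loop. The class and function names here are illustrative, and for simplicity this sketch charges each step with plain linear composition; the moments accountant [18] that the quote refers to gives much tighter cumulative bounds under subsampled Gaussian noise.

```python
class PrivacyAccountant:
    """Tracks one party's cumulative privacy spend against its allocated budget.
    Sketch only: uses linear composition (each step spends its full epsilon)
    rather than the tighter moments-accountant bound."""
    def __init__(self, total_epsilon):
        self.total = total_epsilon
        self.spent = 0.0

    def can_spend(self, step_epsilon):
        return self.spent + step_epsilon <= self.total

    def spend(self, step_epsilon):
        if not self.can_spend(step_epsilon):
            raise RuntimeError("privacy budget exhausted")
        self.spent += step_epsilon

def local_training(accountant, step_epsilon):
    """Repeat local (noisy) gradient steps until the budget runs out."""
    steps = 0
    while accountant.can_spend(step_epsilon):
        # ... compute and share a noisy local gradient here ...
        accountant.spend(step_epsilon)
        steps += 1
    return steps

acct = PrivacyAccountant(total_epsilon=1.0)
print(local_training(acct, step_epsilon=0.25))  # 4 -- four steps fit in the budget
```

Each party runs its own accountant, so the number of local training steps it can take is bounded by its individual budget regardless of what other parties do.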