2017
DOI: 10.1109/tnnls.2016.2597444
Fully Decentralized Semi-supervised Learning via Privacy-preserving Matrix Completion

Abstract: Distributed learning refers to the problem of inferring a function when the training data are distributed among different nodes. While significant work has been done in the contexts of supervised and unsupervised learning, the intermediate case of semi-supervised learning in the distributed setting has received less attention. In this paper, we propose an algorithm for this class of problems by extending the framework of manifold regularization. The main component of the proposed algorithm consists of a fully…
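The abstract's core building block is manifold regularization. For background, below is a minimal single-machine sketch of Laplacian-regularized least squares (LapRLS, the standard manifold-regularization algorithm of Belkin et al., 2006). All names are illustrative, and the paper's decentralized, privacy-preserving machinery is deliberately omitted here:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF similarities: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lap_rls(X, y_labeled, n_labeled, gamma_A=1e-2, gamma_I=1e-2):
    """Laplacian-regularized least squares on labeled + unlabeled points.

    X         : (n, d) array; the first n_labeled rows are the labeled points.
    y_labeled : (n_labeled,) labels in {-1, +1}.
    Returns alpha; the decision function is f(x) = rbf_kernel(x, X) @ alpha.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X)              # (n, n) Gram matrix over all points
    W = K                             # similarity graph (same RBF, for brevity)
    L = np.diag(W.sum(axis=1)) - W    # unnormalized graph Laplacian
    J = np.zeros((n, n))              # diagonal selector of labeled points
    J[:n_labeled, :n_labeled] = np.eye(n_labeled)
    y = np.zeros(n)
    y[:n_labeled] = y_labeled
    l = n_labeled
    # Closed-form LapRLS solution (Belkin et al., 2006):
    #   alpha = (J K + gamma_A * l * I + gamma_I * l * L K)^{-1} y
    A = J @ K + gamma_A * l * np.eye(n) + gamma_I * l * (L @ K)
    return np.linalg.solve(A, y)
```

The graph-Laplacian term penalizes predictions that vary across similar points, which is what lets the unlabeled data shape the classifier.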

Cited by 46 publications (19 citation statements)
References 48 publications
“…Therefore, an independent cloud server outside the sites is necessary. Reference [35] proposed a distributed matrix completion algorithm and used it to perform a distributed semisupervised manifold regularization. During the matrix completion, each site exchanges inter-site data similarities with neighbors through privacy-preserving similarity computation protocols.…”
Section: Related Work
confidence: 99%
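The matrix being completed in this quote is the partially observed inter-site similarity matrix. For orientation only, here is a minimal centralized sketch of low-rank matrix completion via soft-thresholded SVD (SoftImpute-style, Mazumder et al., 2010); it is not the decentralized, privacy-preserving algorithm of [35]:

```python
import numpy as np

def soft_impute(M, mask, lam=1.0, n_iters=100):
    # Low-rank completion of a partially observed matrix by iterative
    # soft-thresholded SVD. `mask` is True where an entry of M is observed.
    Z = np.zeros_like(M, dtype=float)
    for _ in range(n_iters):
        filled = np.where(mask, M, Z)              # keep observed, fill the rest
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        s = np.maximum(s - lam, 0.0)               # shrink singular values
        Z = (U * s) @ Vt
    return Z

# Example: recover a rank-3 "similarity" matrix from ~half its entries.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 3))
S_true = A @ A.T
mask = rng.random(S_true.shape) < 0.5
S_hat = soft_impute(S_true, mask, lam=0.5, n_iters=200)
```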
“…During the matrix completion, each site exchanges inter-site data similarities with neighbors through privacy-preserving similarity computation protocols. According to [36], a distance-recoverable protocol is not secure against malicious adversaries, which makes the method in [35] only suitable against a semi-honest adversary. Reference [37] assumed a scenario where additional nonprivate data are available and provided a learning model to improve the accuracy of a differential-private classifier using both private and non-private data.…”
Section: Related Work
confidence: 99%
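For reference, the standard primitive behind differentially private classifiers like the one in [37] is the Laplace mechanism. The sketch below is generic (Dwork et al., 2006) and not the specific model of that reference:

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    # Release `value` with epsilon-differential privacy by adding
    # Laplace noise with scale sensitivity / epsilon.
    rng = rng or np.random.default_rng()
    return value + rng.laplace(scale=sensitivity / epsilon)

# Example: privately release a class count. Adding or removing one record
# changes a count by at most 1, so the sensitivity is 1.
private_count = laplace_mechanism(value=128, sensitivity=1.0, epsilon=0.5)
```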
“…Therefore, as the LLDs cannot be reconstructed, the contents of the original speech signals are protected. Recently, a decentralized SSL paradigm was proposed in [137], in which privacy-preserving matrix completion algorithms are used, so that only learned knowledge is transferred between different clients, while the raw data are incommutable. However, as these approaches cannot fully guarantee client security and privacy or maintain the original performance, continued research addressing privacy concerns is required.…”
Section: Conclusion and Challenges for Future Work
confidence: 99%
“…In this work, we are exploring the use of compression in a framework for distributed classification. The main motivation of this work is the optimization of the distributed machine learning algorithm when computational and communication costs are to be preserved as is the case in, e.g., resource-constrained scenarios (e.g., edge machine learning [43], smart sensing and privacy-preserving algorithms [8]). Broadly, the objective for compression can be formulated as…”
Section: Introduction
confidence: 99%
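The quoted excerpt is truncated before the formula. Purely for orientation, and not necessarily the objective used in that work, compression for distributed learning is commonly posed as a Lagrangian rate-distortion trade-off, $\min_q \; D(x, \hat{x}(q(x))) + \lambda \, R(q(x))$, where $q$ is the encoder, $D$ measures the task-relevant distortion, $R$ is the communication rate in bits, and $\lambda$ trades accuracy against bandwidth.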