As the modern world becomes increasingly digitized and interconnected, distributed signal processing has proven effective for handling the resulting large volumes of data. A main challenge limiting its broad use, however, is preserving privacy when the data are sensitive. To address this issue, we propose a novel yet general subspace perturbation method for privacy-preserving distributed optimization, which allows each node to obtain the desired solution while protecting its private data. In particular, we show that the dual variables introduced by each distributed optimizer do not converge in a certain subspace determined by the graph topology. The optimization variable, in contrast, is guaranteed to converge to the desired solution, because it is orthogonal to this non-convergent subspace. We therefore propose to insert noise into the non-convergent subspace through the dual variables, so that the private data are protected while the accuracy of the desired solution is completely unaffected. Moreover, the proposed method is shown to be secure under two widely used adversary models: passive and eavesdropping. Furthermore, we instantiate the method with several distributed optimizers, such as ADMM and PDMM, to demonstrate its general applicability, and we evaluate its performance on a set of applications. Numerical tests indicate that the proposed method is superior to existing methods in terms of estimation accuracy, privacy level, communication cost, and convergence rate.
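The core claim can be illustrated with a toy numerical sketch (an illustrative simulation under assumed choices, not the authors' implementation): PDMM-based average consensus on a small cycle graph, where the edge-wise dual variables z_{i|j} are initialised with large random noise. The graph, node values, and penalty parameter c below are arbitrary illustrative choices; the point is that the primal iterates still reach the exact average despite the large dual perturbation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: a 5-node cycle; node i holds the private value a[i].
n = 5
neighbors = [[] for _ in range(n)]
for i in range(n):
    j = (i + 1) % n
    neighbors[i].append(j)
    neighbors[j].append(i)

a = rng.normal(size=n)   # private data; the goal is to compute its average
c = 0.5                  # PDMM penalty parameter (illustrative choice)

def A(i, j):
    # Edge-constraint sign convention: A_ij = +1 if i < j, else -1.
    return 1.0 if i < j else -1.0

# Dual variables: z[(i, j)] is the value node i holds for edge (i, j).
# Subspace perturbation: initialise z with LARGE random noise; its
# component in the non-convergent subspace masks the private data,
# while the primal iterates still converge to the exact average.
z = {(i, j): 1e3 * rng.normal() for i in range(n) for j in neighbors[i]}

x = np.zeros(n)
for _ in range(1000):
    # Local primal update for the quadratic cost f_i(x) = (1/2)(x - a_i)^2.
    for i in range(n):
        s = sum(A(i, j) * z[(i, j)] for j in neighbors[i])
        x[i] = (a[i] - s) / (1 + c * len(neighbors[i]))
    # Dual exchange: node j sends z_{j|i} + 2c * A_ji * x_j to node i.
    z = {(i, j): z[(j, i)] + 2 * c * A(j, i) * x[j]
         for i in range(n) for j in neighbors[i]}

print(np.max(np.abs(x - a.mean())))  # primal error despite the noisy duals
```

Despite the dual variables being perturbed by noise three orders of magnitude larger than the data, the maximum deviation of the primal iterates from the true average shrinks to numerical precision.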
One major concern in distributed computation over networks is the privacy of the individual nodes. To address this concern in the context of the distributed average consensus problem, we propose a general yet simple solution that achieves privacy using additive secret sharing, a tool from secure multiparty computation. The method enables each node to reach the consensus accurately while achieving perfect security at the same time; unlike differential-privacy-based approaches, there is no trade-off between privacy and accuracy. Moreover, the proposed method is computationally simple compared to other secure multiparty computation techniques, and it achieves perfect security for any honest node that has at least one honest neighbour under the honest-but-curious model, without requiring a trusted third party.
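The additive secret sharing primitive can be sketched in a few lines (a minimal toy example with assumed values and a fully connected exchange, not the paper's full protocol): each node splits its private value into random shares that sum to the value, distributes the shares, and only share sums are ever revealed; the masks cancel, so the network total, and hence the average, is preserved exactly.

```python
import random

random.seed(1)

# Toy fully connected network; node i's private value:
values = [3.0, 1.0, 4.0, 1.5]
n = len(values)

# Each node i splits a_i into n additive shares that sum to a_i,
# keeping one and sending one to every other node. The masks may have
# arbitrarily large variance without affecting the final result.
shares = []
for a in values:
    masks = [random.gauss(0, 100) for _ in range(n - 1)]
    shares.append(masks + [a - sum(masks)])  # shares sum exactly to a

# Node j only ever sees the shares addressed to it, and sums them.
masked = [sum(shares[i][j] for i in range(n)) for j in range(n)]

# The masks cancel: the masked values have the same total as the private
# ones, so an exact average-consensus routine on them yields the true
# average, with no privacy-accuracy trade-off.
print(sum(masked) / n, sum(values) / n)
```

Each individual share is statistically independent of the private value it came from (up to the last share, which is masked by the others), which is the intuition behind the perfect-security claim when at least one neighbour is honest.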
In many applications of wireless sensor networks, it is important that the privacy of the nodes be protected, and privacy-preserving algorithms have therefore received considerable attention recently. In this paper, we propose a novel convex optimization-based solution to the problem of privacy-preserving distributed average consensus. The proposed method is based on the primal-dual method of multipliers (PDMM), and we show that the dual variables introduced by PDMM converge only in a certain subspace determined by the graph topology and do not converge in its orthogonal complement. These properties are exploited to prevent the private data from being revealed to others. More specifically, the proposed algorithm is proven to be secure under both passive and eavesdropping adversary models. Finally, the convergence properties and accuracy of the proposed approach are demonstrated by simulations, which show that the method is superior to the state of the art.