With the advance of machine learning and the Internet of Things (IoT), security and privacy have become critical concerns in mobile services and networks. Transferring data to a central unit violates the privacy of sensitive data. Federated learning mitigates the need to transfer local data by sharing model updates only. However, privacy leakage remains an issue. This paper proposes xMK-CKKS, an improved version of the MK-CKKS multi-key homomorphic encryption protocol, to design a novel privacy-preserving federated learning scheme. In this scheme, model updates are encrypted via an aggregated public key before they are shared with a server for aggregation. Decryption requires collaboration among all participating devices. The scheme prevents privacy leakage from publicly shared model updates in federated learning and is resistant to collusion between k < N − 1 participating devices and the server. The evaluation demonstrates that the scheme outperforms other approaches in communication and computational cost while preserving model accuracy.
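To make the data flow concrete, below is a minimal, insecure Python sketch of the scheme as the abstract describes it: each device encrypts its update under an aggregated public key, the server adds ciphertexts, and the sum can only be recovered once every device contributes a decryption share. Plain integers modulo q stand in for the RLWE ring elements of the real xMK-CKKS protocol, noise terms are omitted, and all names (Device, encrypt, decryption_share) are illustrative assumptions rather than definitions from the paper.

```python
"""Toy, insecure illustration of the xMK-CKKS-style message flow only."""
import secrets

Q = 2**61 - 1             # toy modulus
A = secrets.randbelow(Q)  # shared public parameter ("a" in MK-CKKS)

class Device:
    def __init__(self, update: int):
        self.update = update              # quantized model update (toy scalar)
        self.s = secrets.randbelow(Q)     # secret key, never leaves the device
        self.b = (-A * self.s) % Q        # public-key share (real scheme adds noise)

    def encrypt(self, agg_pk: int) -> tuple[int, int]:
        """Encrypt the local update under the aggregated public key."""
        v = secrets.randbelow(Q)          # per-ciphertext randomness
        c0 = (v * agg_pk + self.update) % Q
        c1 = (v * A) % Q
        return c0, c1

    def decryption_share(self, c1_sum: int) -> int:
        """Partial decryption; the real protocol also adds smudging noise here."""
        return (self.s * c1_sum) % Q

# --- one aggregation round -------------------------------------------------
devices = [Device(update) for update in (3, 7, 12)]    # toy "model updates"
agg_pk = sum(d.b for d in devices) % Q                 # aggregated public key

ciphertexts = [d.encrypt(agg_pk) for d in devices]     # sent to the server
c0_sum = sum(c0 for c0, _ in ciphertexts) % Q          # server-side aggregation
c1_sum = sum(c1 for _, c1 in ciphertexts) % Q

shares = [d.decryption_share(c1_sum) for d in devices] # all devices must help
aggregate = (c0_sum + sum(shares)) % Q
assert aggregate == sum(d.update for d in devices)     # server learns only the sum
```

Because the aggregated public key is the sum of all individual public-key shares, the masking terms only cancel when every device's secret key contributes a share, which is what makes the scheme resistant to collusion among fewer than N − 1 devices and the server.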
Perovskite Solar Cells
In the article at http://doi.wiley.com/10.1002/solr.202000001, Jingjing Chang and co-workers demonstrate a novel approach in which a short-period deep-ultraviolet (DUV) photoactivation process modifies SnO2 electron transport layers, yielding all-inorganic perovskite solar cells with efficiency exceeding 15% and good stability. The DUV treatment induces better energy-level alignment and more ordered crystal growth, and reduces interface-stress-related defects.
With the advance of machine learning and the Internet of Things (IoT), security and privacy have become key concerns in mobile services and networks. Transferring data to a central unit compromises the privacy and protection of sensitive data while increasing bandwidth demands. Federated learning mitigates the need to transfer local data by sharing model updates only. However, data leakage remains an issue. In this paper, we propose xMK-CKKS, a multi-key homomorphic encryption protocol, to design a novel privacy-preserving federated learning scheme. In this scheme, model updates are encrypted via an aggregated public key before sharing with a server for aggregation. For decryption, collaboration among all participating devices is required. This scheme prevents privacy leakage from publicly shared information in federated learning and is robust to collusion between k < N − 1 participating devices and the server. Our experimental evaluation demonstrates that the scheme preserves model accuracy relative to traditional federated learning as well as to secure federated learning with homomorphic encryption (MK-CKKS, Paillier), and reduces computational cost compared to Paillier-based federated learning. The average energy consumption is 2.4 W, making the scheme well suited to IoT scenarios.
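For comparison with the Paillier baseline mentioned above, the following sketch shows how additively homomorphic aggregation looks with the python-paillier (phe) library: the server sums ciphertexts and only the private-key holder can decrypt the aggregate. The single key pair and the parameter choices here are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch of a Paillier-based aggregation baseline using python-paillier (phe).
# A single key pair is assumed purely for illustration; real deployments differ.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each client encrypts its (quantized) model update with the shared public key.
client_updates = [0.25, -0.10, 0.40]                 # toy per-client updates
encrypted = [public_key.encrypt(u) for u in client_updates]

# The server adds ciphertexts without learning any individual update.
encrypted_sum = encrypted[0]
for ct in encrypted[1:]:
    encrypted_sum = encrypted_sum + ct

# Only the private-key holder can recover the aggregate.
aggregate = private_key.decrypt(encrypted_sum)
print(aggregate)  # ~0.55
```

Unlike the aggregated-key design of xMK-CKKS, this single-key setup concentrates decryption power in whoever holds the private key, which is the trust issue that collaborative decryption is meant to avoid.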