2022
DOI: 10.1109/tpds.2021.3098467
Multi-Task Federated Learning for Personalised Deep Neural Networks in Edge Computing

Abstract: Federated Learning (FL) is an emerging approach for collaboratively training Deep Neural Networks (DNNs) on mobile devices, without private user data leaving the devices. Previous works have shown that non-Independent and Identically Distributed (non-IID) user data harms the convergence speed of the FL algorithms. Furthermore, most existing work on FL measures global-model accuracy, but in many cases, such as user content-recommendation, improving individual User model Accuracy (UA) is the real objective. To a…

Cited by 163 publications (44 citation statements)
References 15 publications
“…In this proposed protocol, Enc(x_i), Enc(y_i), and Enc(x_i ⊕ y_i) are encrypted with Paillier cryptography [43]. In Paillier cryptography, a message m ∈ Z_n is encrypted by (3) with integers n = p·q and g = 1 + n mod n^2, where p and q are prime numbers of about 3000 bits, r is a random number with 0 < r < n, r ∈ Z*_{n^2}, and gcd(r, n) = 1. The public key is (n, g).…”
Section: Implementation Of the Comparison Protocol To Realize The B2b... (mentioning)
confidence: 99%
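To make the quoted scheme concrete, the following is a minimal Paillier sketch in Python, assuming the common choice g = n + 1 and using toy primes rather than the roughly 3000-bit primes mentioned above; the function names are illustrative, not those of the cited protocol [43].

```python
import math
import secrets

def paillier_keygen(p, q):
    # Toy key generation; p and q must be distinct primes (small here for illustration only).
    n = p * q
    g = n + 1                                        # common choice g = 1 + n
    lam = math.lcm(p - 1, q - 1)                     # lambda = lcm(p - 1, q - 1)
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)   # mu = L(g^lambda mod n^2)^-1 mod n
    return (n, g), (lam, mu)

def paillier_encrypt(pub, m):
    # Enc(m) = g^m * r^n mod n^2, with random r such that 0 < r < n and gcd(r, n) = 1.
    n, g = pub
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def paillier_decrypt(pub, priv, c):
    # Dec(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) / n.
    n, _ = pub
    lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

# Additive homomorphism: the product of two ciphertexts decrypts to the sum of the plaintexts,
# which is the property comparison protocols of this kind typically exploit.
pub, priv = paillier_keygen(1009, 1013)
c = paillier_encrypt(pub, 42) * paillier_encrypt(pub, 58) % (pub[0] ** 2)
assert paillier_decrypt(pub, priv, c) == 100
```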
“…Blockchain is a digital ledger for record-keeping over peer-to-peer (P2P) networks [1,2]. It is decentralized and dispersed in nature, with tamper-resistant and tamper-evident features [3][4][5]. Each peer stores a copy of the blockchain and verifies the validity of the stored data, such that no peer can tamper with the data.…”
Section: Introduction (mentioning)
confidence: 99%
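As a rough illustration of the tamper-evident property described above (not the cited systems' implementation), the sketch below links blocks by hash, so any edit to an earlier block breaks the link that every peer can re-check.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's full contents, including the previous block's hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def verify_chain(chain):
    # A peer re-hashes every block and checks the prev_hash links; any tampering breaks a link.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

# Tiny two-block ledger: editing block 0 invalidates block 1's prev_hash link.
genesis = {"index": 0, "data": "genesis", "prev_hash": "0" * 64}
block1 = {"index": 1, "data": "record from peer A", "prev_hash": block_hash(genesis)}
chain = [genesis, block1]
assert verify_chain(chain)
genesis["data"] = "tampered"
assert not verify_chain(chain)
```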
“…Blockchain is a digital ledger for record-keeping [10,11]. It is decentralized and dispersed in nature, with tamper-resistant and tamper-evident features [12][13][14]. Google Inc. introduced the FL idea to solve the privacy-preservation and communication-overhead issues that emerged from combining data from several nodes and storing it in a centralized location [15,16].…”
Section: Introduction (mentioning)
confidence: 99%
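The FL idea referred to here can be sketched as federated averaging: each client trains on its own private data and sends only model weights to a server, which aggregates them. The toy linear-model sketch below (made-up data and helper names, not any cited paper's algorithm) shows one such setup.

```python
import numpy as np

def local_update(w, data, lr=0.1, epochs=1):
    # Hypothetical local training: a few gradient steps on one client's private data (linear model).
    X, y = data
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)     # least-squares gradient step
    return w

def federated_averaging(global_w, clients):
    # One round: clients train locally; the server averages the returned weights by data size.
    updates = [local_update(global_w, c) for c in clients]
    sizes = [len(c[1]) for c in clients]
    return np.average(updates, axis=0, weights=sizes)

# Two clients with private data; only weights, never raw samples, reach the server.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
w = np.zeros(3)
for _ in range(5):
    w = federated_averaging(w, clients)
```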
“…Adaptive optimizers, such as Adam [21], learn the model parameters flexibly. Adam is less sensitive to hyperparameters, achieves faster convergence, and is used in FL-based applications [22]. Adamax is a variant of Adam, and it is suitable for datasets with large spikes, similar to those of the energy consumption dataset [21].…”
Section: Approach (mentioning)
confidence: 99%
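The per-parameter adaptivity attributed to Adam in the quote comes from its running moment estimates of the gradient. A minimal NumPy sketch of a single Adam step (our own illustration, not the cited implementation [22]):

```python
import numpy as np

def adam_step(theta, grad, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # One Adam update: per-parameter step sizes from bias-corrected first/second moment estimates.
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad               # first moment (running mean of gradients)
    v = b2 * v + (1 - b2) * grad ** 2          # second moment (running mean of squared gradients)
    m_hat = m / (1 - b1 ** t)                  # bias corrections
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

# Minimise f(x) = ||x||^2 (gradient 2x); the effective step size adapts per coordinate.
theta = np.array([5.0, -3.0])
state = (np.zeros_like(theta), np.zeros_like(theta), 0)
for _ in range(200):
    theta, state = adam_step(theta, 2 * theta, state)
```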
“…However, SGD uses a fixed learning rate on the model parameters and hence does not suit a dataset whose features occur at different frequencies. Adaptive optimization functions, such as Adam [21], learn the model parameters in a flexible manner, are less sensitive to hyperparameters, and can be utilized in federated learning applications [22]. We use Adamax, which is a variant of Adam based on the infinity norm (the largest magnitude of the input vector), as it is suitable for datasets with sudden large swings, similar to the energy consumption dataset [21].…”
Section: Appliance-level Energy Prediction Framework (mentioning)
confidence: 99%
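Adamax, the infinity-norm variant mentioned here, replaces Adam's second-moment estimate with an exponentially weighted maximum of gradient magnitudes, which is why it tolerates sudden large swings better than a fixed-learning-rate SGD step (theta -= lr * grad). A minimal sketch, again our own illustration rather than the cited framework's code:

```python
import numpy as np

def adamax_step(theta, grad, state, lr=2e-3, b1=0.9, b2=0.999, eps=1e-8):
    # One Adamax update: the denominator is an exponentially weighted infinity norm of past gradients.
    m, u, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad               # first moment, as in Adam
    u = np.maximum(b2 * u, np.abs(grad))       # infinity-norm estimate; robust to sudden large gradients
    return theta - (lr / (1 - b1 ** t)) * m / (u + eps), (m, u, t)

# Same toy objective as before, f(x) = ||x||^2 with gradient 2x.
theta = np.array([5.0, -3.0])
state = (np.zeros_like(theta), np.zeros_like(theta), 0)
for _ in range(200):
    theta, state = adamax_step(theta, 2 * theta, state)
```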