2021 IEEE 37th International Conference on Data Engineering (ICDE) 2021
DOI: 10.1109/icde51399.2021.00039
Efficient Federated-Learning Model Debugging

Cited by 30 publications (8 citation statements) | References 14 publications
“…As shown in Table 2, both strategies select 3 identical clients (2, 6, 18). A comparison of the other 2 clients selected by the two strategies shows that our selection of clients 15 and 16 contains data samples from all categories, while the clients selected by DICE (i.e., 0 and 1) omit these rare categories (i.e., 7, 8, and 9).…”
Section: RS
confidence: 97%
“…However, the ensuing data privacy issues pose new challenges for model developers [1]. As a result, scholars have proposed Federated Learning (FL), a distributed training method designed to protect user privacy [2]. FL takes full advantage of the abundant data resources and computational power on each device, with the server acting as the coordination center.…”
Section: Introduction
confidence: 99%
“…However, it limits machine learning's capability to deal with applications where data is isolated across different organizations and data privacy is emphasized [1], [2]; e.g., a user's private pictures and videos [3]-[5] captured by mobile phone, as well as social relationships [6], should not be leaked during model training. Federated learning (FL) [7]-[10] is an emerging technology that enables multiple parties to collaboratively train a machine learning model by iteratively exchanging model parameters between these parties and a centralized server, while keeping their datasets private. According to the distribution of data, there are three types of FL methods: horizontal federated learning (HFL) [7], [11], vertical federated learning (VFL) [12], [13], and federated transfer learning (FTL) [14].…”
Section: Introduction
confidence: 99%
“…mobile edge computing (MEC) scenarios due to the limitations of data privacy and decentralization. As a popular privacy-preserving and distributed AI paradigm, federated learning (FL) [21], [32], [34]-[36], [38] is poised to become the infrastructure for AI applications in IoT [3], [12], [16], [24], [29], [38] and MEC [11], [18], [19], [33], [34] scenarios. By dispatching a global model to multiple clients for local training and then aggregating the trained models, FL realizes distributed model training without data sharing, which also greatly reduces the risk of data privacy leakage.…”
Section: Introduction
confidence: 99%
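The dispatch-train-aggregate loop described in these excerpts is the standard federated averaging (FedAvg) pattern. A minimal sketch follows; the linear least-squares local model, the synthetic client data, and the size-weighted aggregation are illustrative assumptions, not the cited papers' actual setups:

```python
import numpy as np

def local_train(global_weights, data, labels, lr=0.1, epochs=5):
    """One client's local update: a few gradient-descent steps on a
    linear least-squares model (stand-in for real local training)."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - labels) / len(labels)
        w -= lr * grad
    return w

def fedavg_round(global_weights, clients):
    """One FL round: dispatch the global model to every client,
    collect locally trained models, and aggregate them weighted
    by local dataset size. Raw data never leaves the clients."""
    updates, sizes = [], []
    for data, labels in clients:
        updates.append(local_train(global_weights, data, labels))
        sizes.append(len(labels))
    return np.average(updates, axis=0, weights=np.asarray(sizes, float))

# Synthetic federation: 5 clients share one underlying linear model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(5):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, clients)
# w now approaches true_w without any client sharing its raw data
```

Weighting the average by each client's dataset size is the choice made in the original FedAvg formulation; it keeps the aggregate equivalent to training on the pooled data when clients' distributions match.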