Proceedings of the 19th ACM Conference on Embedded Networked Sensor Systems 2021
DOI: 10.1145/3485730.3485946
FedDL

Cited by 59 publications (22 citation statements)
References 42 publications
“…Deep Neural Networks (DNNs) have shown remarkable performance in various applications such as computer vision [43,75], natural language understanding [56,76], and human activity recognition [53,62]. Despite their strong capabilities, they usually require a huge volume of training data, which is typically produced and stored on edge devices (e.g., mobile phones).…”
Section: Introduction
confidence: 99%
“…Wu et al. [36] proposed a generative convolutional autoencoder network and fine-tuned the parameters of its higher layers to obtain more accurate personalized models. Another way to address statistical heterogeneity is user clustering, which enables the FL system to capture the underlying relationships between users, as studied by Ouyang et al. [34] and Tu et al. [35]. By clustering users, those in the same group can collaboratively learn personalized models.…”
Section: Results
confidence: 99%
“…Ouyang et al. [34] designed a learned cluster structure for their system, which allows the use of cluster-wise straggler dropout and correlation-based node selection to reduce communication overhead. Moreover, Tu et al. [35] proposed FedDL, which reduces the number of parameters communicated between users and the server through a dynamic layer-wise sharing scheme: only the lower layers of local models need to be uploaded to the server for global training.…”
Section: Results
confidence: 99%
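The layer-wise sharing idea described in this citation can be sketched as follows. This is a minimal illustration of the concept, not FedDL's actual implementation: the model structure, layer names, and averaging step are all assumptions for the example.

```python
import numpy as np

def shared_layers(model, num_shared):
    """Client-side: keep only the lower `num_shared` layers for upload.

    `model` is assumed to be an insertion-ordered dict of
    layer_name -> weight array, ordered from lowest to highest layer.
    """
    return dict(list(model.items())[:num_shared])

def federated_average(client_updates):
    """Server-side: average the uploaded lower-layer weights per layer."""
    layer_names = client_updates[0].keys()
    return {
        name: np.mean([u[name] for u in client_updates], axis=0)
        for name in layer_names
    }

# Two toy clients with 3-layer models; only the 2 lower layers are shared,
# so the top "fc" layer stays local and personalized.
client_a = {"conv1": np.ones((2, 2)), "conv2": np.ones((2, 2)) * 3, "fc": np.ones(4)}
client_b = {"conv1": np.ones((2, 2)) * 3, "conv2": np.ones((2, 2)), "fc": np.zeros(4)}

uploads = [shared_layers(m, num_shared=2) for m in (client_a, client_b)]
global_lower = federated_average(uploads)
# Only conv1/conv2 appear in the globally averaged parameters.
```

The communication saving follows directly: each round transfers only the lower-layer parameters instead of the full model.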
“…The main drawback of these approaches is that they consider a convex objective function, which is not suitable for complex HAR models based on deep learning. Recently, a few works proposed solutions based on Federated Clustering [37,38,13]. The goal of Federated Clustering is to create specialized global models (server-side) by grouping users that perform activities in a similar way.…”
Section: Related Work
confidence: 99%
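The Federated Clustering idea above — grouping users who perform activities similarly so each group learns its own specialized global model — can be sketched with a toy server-side grouping step. The greedy cosine-similarity rule and the threshold below are illustrative assumptions, not the algorithm of any cited paper.

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flattened model updates."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def cluster_users(updates, threshold=0.9):
    """Greedy grouping: a user joins the first cluster whose representative
    update is sufficiently similar; otherwise it starts a new cluster."""
    clusters = []  # list of (representative_update, [user_ids])
    for user_id, u in updates.items():
        for rep, members in clusters:
            if cosine_sim(rep, u) >= threshold:
                members.append(user_id)
                break
        else:
            clusters.append((u, [user_id]))
    return [members for _, members in clusters]

# Users 0 and 1 produce similar updates (similar activity patterns);
# user 2 differs, so it gets its own specialized group.
updates = {
    0: np.array([1.0, 0.9, 1.1]),
    1: np.array([1.1, 1.0, 0.9]),
    2: np.array([-1.0, 1.0, -1.0]),
}
groups = cluster_users(updates)
# → [[0, 1], [2]]
```

In a full system, the server would then run federated averaging separately within each group, yielding one specialized model per cluster of similar users.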