2020 IEEE 19th International Symposium on Network Computing and Applications (NCA)
DOI: 10.1109/nca51143.2020.9306745
Federated vs. Centralized Machine Learning under Privacy-elastic Users: A Comparative Analysis

Cited by 45 publications (19 citation statements)
References 15 publications
“…Analytical calculations are derived to capture the bandwidth and consumed energy of each scheme along with the way the involved energy costs are broken down into the system's stakeholders; thus, revealing each scheme's benefits and emerging trade-offs. This study extends our previous work [12] on the CL-FL comparison by introducing EL as an alternative distributed ML scheme; we extend the system model to allow for edge node utilization and shed light on the resources consumption per-stakeholder. Our theoretical analysis and numerical results recommend for ML workloads an edge node deployment at a regional level, closer to the cloud.…”
Section: Introduction
confidence: 61%
“…setting. The ML task terminates when all data is depleted, since our previous work in [12] has shown that utilizing the complete dataset once (essentially one epoch) with the above-mentioned model, suffices for over 75% accuracy for FL and over 80% for CL.…”
Section: Numerical Results
confidence: 99%
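The citation above contrasts Federated Learning (FL), where clients train locally and only share model updates, with Centralized Learning (CL), where raw data is pooled on a server. A minimal, hypothetical sketch of that contrast (not the cited paper's implementation; the 1-D linear model, toy client datasets, and learning rate are illustrative assumptions) is:

```python
# Hypothetical sketch: one round of FL (FedAvg-style weight averaging) vs.
# CL (training on pooled raw data), using a toy 1-D linear model y = w * x
# trained with a single full-batch gradient step for MSE loss.

def grad_step(w, data, lr=0.01):
    """One full-batch gradient step for MSE loss on the model y = w * x."""
    g = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * g

# Toy client datasets (x, y) drawn from the true relation y = 3 * x.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0), (4.0, 12.0)],
]

w0 = 0.0  # shared initial weight

# CL: all raw data is uploaded to a central server, then trained on directly.
pooled = [pt for c in clients for pt in c]
w_cl = grad_step(w0, pooled)

# FL: each client trains on its own data; only weights leave the device,
# and the server averages them (equal-sized clients, so a plain mean).
local_weights = [grad_step(w0, c) for c in clients]
w_fl = sum(local_weights) / len(local_weights)

print(w_cl, w_fl)  # with one local step and equal client sizes, these match
```

With a single local step per round and equally sized clients, the averaged FL update coincides with the CL gradient step; the accuracy gap the citation reports (over 75% for FL vs. over 80% for CL after one epoch) arises in realistic settings with multiple local steps and non-IID client data.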
“…The simulation environment was set-up in a single desktop machine with the following characteristics: Intel Core i7-10700 CPU @ 2.9 GHz, 64-bit, RAM 16 GB, OS Windows 10. We have chosen the following scenario, based on our previous work [19];…”
Section: Simulation Environment and Results
confidence: 99%
“…Learning Architecture. ML algorithms are generally designed with a centralized or Standalone Approach (SA) [106] in which the training data is fed to a centralized machine where training and decisions are made. However, data is generally scattered and massive for network applications, whereas the centralized approach requires massive computational resources.…”
Section: 24
confidence: 99%