2021 · DOI: 10.1109/tgcn.2021.3096884
Energy-Efficient Topology Construction via Power Allocation for Decentralized Learning via Smart Devices With Edge Computing

Cited by 11 publications (4 citation statements)
References 44 publications
“…We can observe that a one-trip communication from the leaf nodes to the root and back to the leaf nodes is necessary for centralized server-based model fine-tuning. Despite the simplicity of this approach, the aggregation tree as a whole easily suffers from the communication and computational bottlenecks of each member node [24], especially in a mobile social network, where the large volume of news and the rapid growth of data are two critical characteristics of MSNs. Moreover, to secure user data against model inversion attacks that reproduce the fine-tuning data [12,14,15], some security-sensitive applications incorporate random noise into model fine-tuning [6,12].…”
Section: Online Model Fine-tuning and Ensemble-based Model Inferencementioning
confidence: 99%
“…Aggregating: the most common aggregation method is majority voting for classification tasks (selecting the class with the most votes) [16]. The base model of a random forest is the decision tree; thus, when building the model, we need to take hyperparameters such as the splitting criterion and the minimum number of samples per leaf into account.…”
Section: Random Forestmentioning
confidence: 99%
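The majority-voting aggregation described in the quoted statement can be sketched in a few lines; the function name and the string class labels below are illustrative, not taken from the cited work:

```python
from collections import Counter

def majority_vote(predictions):
    """Aggregate class predictions from an ensemble of base models
    (e.g. the decision trees of a random forest) by majority voting:
    return the class that receives the most votes."""
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Three base learners vote; "cat" wins 2-1.
print(majority_vote(["cat", "dog", "cat"]))
```

Ties are broken here by insertion order via `Counter.most_common`; real ensemble libraries typically break ties by class index or by averaged class probabilities instead.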
“…Although context-agnostic methods can always be applied to improve energy efficiency and resource utilization, an essential and unique feature of decentralized learning is the possibility of adjusting the communication network topology. In [29], the authors select a subset of the available links for model broadcasting so as to minimize the transmission power levels, subject to a constraint on the minimum number of links required to guarantee the convergence of learning.…”
Section: Context-aware DL Optimizationmentioning
confidence: 99%
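The link-selection idea summarized in the last statement — keep only the cheapest links while respecting a minimum link count needed for convergence — can be illustrated with a greedy sketch. The tuple layout `(node_u, node_v, tx_power)` and the function name are assumptions for illustration; the actual optimization in [29] is a constrained power-allocation problem, not this simple heuristic:

```python
def select_links(links, min_links):
    """Greedy sketch of energy-aware topology construction:
    rank candidate links by required transmission power and keep
    the cheapest ones, retaining at least `min_links` links
    (a stand-in for the convergence constraint in [29])."""
    ranked = sorted(links, key=lambda link: link[2])  # ascending tx power
    return ranked[:min_links]

# Three candidate links between smart devices; keep the two cheapest.
candidates = [("a", "b", 3.0), ("a", "c", 1.0), ("b", "c", 2.0)]
print(select_links(candidates, 2))
```

A real construction would additionally check that the retained links keep the topology connected, since disconnected components would prevent model mixing across all devices.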