2020
DOI: 10.1109/twc.2020.3021177

Joint Parameter-and-Bandwidth Allocation for Improving the Efficiency of Partitioned Edge Learning

Abstract: To leverage data and computation capabilities of mobile devices, machine learning algorithms are deployed at the network edge for training artificial intelligence (AI) models, resulting in the new paradigm of edge learning. In this paper, we consider the framework of partitioned edge learning for iteratively training a large-scale model using many resource-constrained devices (called workers). To this end, in each iteration, the model is dynamically partitioned into parametric blocks, which are downloaded to w…
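The partitioned training loop described in the abstract can be pictured with a short sketch. The Python snippet below is a toy illustration only (the block sizes, the loss, and the update rule are assumptions, not taken from the paper): the server cuts the flat parameter vector into variable-length blocks, each worker updates only its assigned block, and the server concatenates the updated blocks.

```python
import numpy as np

def partition(theta, block_sizes):
    """Split the flat parameter vector into consecutive blocks of the given sizes."""
    cuts = np.cumsum(block_sizes)[:-1]
    return np.split(theta, cuts)

def worker_update(block, grad_block, lr=0.1):
    """One local gradient step on the assigned parametric block (placeholder update rule)."""
    return block - lr * grad_block

def partel_round(theta, block_sizes, grad_fn, lr=0.1):
    """One round: partition, per-worker block updates, then reassemble on the server."""
    grad_blocks = partition(grad_fn(theta), block_sizes)
    param_blocks = partition(theta, block_sizes)
    updated = [worker_update(b, g, lr) for b, g in zip(param_blocks, grad_blocks)]
    return np.concatenate(updated)

if __name__ == "__main__":
    theta = np.ones(12)
    block_sizes = [5, 4, 3]                 # variable-length blocks for three workers
    grad_fn = lambda th: 2.0 * th           # gradient of a toy quadratic ||theta||^2
    for _ in range(20):
        theta = partel_round(theta, block_sizes, grad_fn)
    print("parameters after 20 rounds:", theta.round(4))
```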

Cited by 59 publications (60 citation statements). References 21 publications (50 reference statements).
“…One is the number of communication rounds required for model convergence and the other is the per-round latency. According to [13], minimizing the overall learning latency is equivalent to minimizing the per-round latency separately, since the number of rounds until model convergence is independent of the parameter and bandwidth allocation. In the sequel, the per-round latency minimization problem is formulated.…”
Section: Problem Formulation and Simplification (citation type: mentioning)
confidence: 99%
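The equivalence invoked in this excerpt can be illustrated numerically. The short Python sketch below uses an assumed latency model (per-worker compute and upload times chosen arbitrarily, not from [13]): because the round count is fixed by the learning algorithm rather than by the resource allocation, it multiplies every candidate allocation's latency by the same constant, so the allocation that minimizes per-round latency also minimizes total training latency.

```python
import numpy as np

def per_round_latency(compute_s, upload_s):
    """One round finishes when the slowest worker finishes compute + upload."""
    return np.max(compute_s + upload_s)

def total_latency(n_rounds, compute_s, upload_s):
    """Overall learning latency = number of rounds x per-round latency."""
    return n_rounds * per_round_latency(compute_s, upload_s)

if __name__ == "__main__":
    n_rounds = 200  # fixed by the learning algorithm, not by the resource allocation
    # Two candidate allocations; each tuple holds per-worker (compute, upload) seconds.
    alloc_a = (np.array([0.8, 1.2, 1.0]), np.array([0.5, 0.3, 0.4]))
    alloc_b = (np.array([1.0, 1.0, 1.0]), np.array([0.4, 0.4, 0.4]))
    for name, (c, u) in [("A", alloc_a), ("B", alloc_b)]:
        print(name, "per-round:", per_round_latency(c, u),
              "total:", total_latency(n_rounds, c, u))
    # The ranking of the allocations is identical under both objectives.
```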
“…where the notation follows that in Problem (P3). Two useful lemmas relating Problems (P3) and (P4) are given as follows; their proofs can be found in Appendices F and G of [13].…”
Section: Joint Parameter Allocation and Bandwidth Allocation (citation type: mentioning)
confidence: 99%
“…Moreover, researchers have designed energy-efficient RRM techniques to tackle the challenge of executing a complex learning task at energy-constrained devices in a FEEL system [18]-[20]. Recently, researchers have also explored the efficient implementation of parameter-server training over wireless channels, resulting in a framework called partitioned edge learning (PARTEL) [21]. Parameter allocation refers to the system operation in which, to balance computation loads, the server divides the model into parametric blocks of variable lengths and allocates them to devices for separate training.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
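The load-balancing role of parameter allocation mentioned in the excerpt above can be sketched with a simple heuristic. The snippet below is an assumed toy model, not the algorithm of [21]: worker k contributes c / f_k seconds of computation and s / r_k seconds of uploading per parameter, and sizing its block inversely to that per-parameter latency equalizes the workers' finishing times.

```python
import numpy as np

def per_round_latency(m, f, r, c=1e3, s=32):
    """Per-round latency (s): slowest worker's compute + upload time for its block of m_k parameters."""
    return np.max(c * m / f + s * m / r)

def proportional_allocation(M, f, r, c=1e3, s=32):
    """Heuristic: block size inversely proportional to per-parameter latency, equalizing worker latencies."""
    per_param = c / f + s / r            # seconds contributed by each parameter at worker k
    weights = 1.0 / per_param
    return M * weights / weights.sum()   # real-valued block sizes summing to M

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K, M = 10, 1_000_000                 # workers, total model parameters
    f = rng.uniform(1e8, 1e9, K)         # CPU cycles per second
    r = rng.uniform(1e6, 1e7, K)         # uplink bits per second
    equal = np.full(K, M / K)
    tuned = proportional_allocation(M, f, r)
    print("equal split latency:", per_round_latency(equal, f, r))
    print("balanced latency   :", per_round_latency(tuned, f, r))
```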