2020
DOI: 10.1109/lwc.2020.3001994
Fair Computation Efficiency Scheduling in NOMA-Aided Mobile Edge Computing

Abstract: Splitting the inference model among device, edge server, and cloud can greatly improve the performance of edge intelligence (EI). Additionally, non-orthogonal multiple access (NOMA), a key enabling technology of B5G/6G, can achieve massive connectivity and high spectrum efficiency. Motivated by these benefits, integrating NOMA with model splitting in MEC to further reduce inference latency becomes attractive. However, NOMA-based communication during split inference has not been properly considered in…

Cited by 17 publications (5 citation statements)
References 56 publications
“…The number of multi-constrained task sets: [2–14]. The number of edge servers: [14–16]. The virtual machine capacity of edge servers: [15–30]. The CPU frequency of edge servers: [2000–2500] MHz…”
Section: Parameter Value
confidence: 99%
“…To improve resource allocation efficiency and promote multiple-resource sharing, Lin et al 27 used dominant resource fairness and MMF to allocate available and remaining resources, respectively. To cope with the unfairness among mobile devices caused by the near-far effect, Huang et al 28 adopted a new objective function for fair computational resource scheduling in MEC scenarios and verified the superiority of the proposed scheme for large-scale and time-sensitive tasks. For fairness-aware resource management in MEC networks with multiple users and multiple servers, Guo et al 29 adopted the MMF resource management algorithm while considering the supply and demand of resources in task allocation, thereby greatly reducing the total time consumption.…”
Section: Related Work
confidence: 99%
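The max-min fairness (MMF) allocation referenced in the statement above is classically computed by progressive filling: grant every unsatisfied user an equal share of the remaining capacity, cap users at their demand, and redistribute the leftover. The capacity and demand values below are hypothetical, for illustration only:

```python
def max_min_fair(capacity, demands):
    """Max-min fair allocation via progressive filling.

    capacity: total divisible resource (e.g. CPU cycles/s on an edge server)
    demands:  per-user demand list; returns per-user allocations.
    """
    alloc = [0.0] * len(demands)
    active = list(range(len(demands)))
    remaining = capacity
    while active:
        share = remaining / len(active)
        # Users whose residual demand fits within the equal share are satisfied.
        satisfied = [i for i in active if demands[i] - alloc[i] <= share]
        if not satisfied:
            # Nobody can be fully satisfied: split the remainder equally and stop.
            for i in active:
                alloc[i] += share
            break
        for i in satisfied:
            remaining -= demands[i] - alloc[i]
            alloc[i] = demands[i]
        active = [i for i in active if i not in satisfied]
    return alloc

# Example: capacity 10, demands [2, 8, 10] -> the small user gets its full
# demand, the other two split the rest equally: [2.0, 4.0, 4.0].
print(max_min_fair(10, [2, 8, 10]))
```

The result has the MMF property: no user's allocation can be increased without decreasing that of a user with an equal or smaller allocation.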
“…The computation capacities of the MEC server and the local IoMT devices are 30 GHz and 5 GHz, respectively. The healthcare data sizes of the five patients are [5, 10, 15, 20, 25] Mbit, respectively. Processing each data packet requires 500 CPU cycles per bit.…”
Section: A Motivational Example Of Fair Healthcare
confidence: 99%
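The numerical setup quoted above implies a simple computation-time model, t = (data bits × cycles per bit) / CPU frequency; interpreting the quoted "30 and 5 GHz" as CPU frequencies is an assumption. A minimal sketch with those parameters:

```python
# Computation time for each patient's data, locally vs. on the MEC server.
# Parameters taken from the quoted example; the time model t = bits * c / f
# is the standard MEC latency model, assumed here.
F_EDGE = 30e9          # MEC server CPU frequency: 30 GHz
F_LOCAL = 5e9          # local IoMT device CPU frequency: 5 GHz
CYCLES_PER_BIT = 500   # required CPU cycles per bit
data_mbit = [5, 10, 15, 20, 25]

for d in data_mbit:
    cycles = d * 1e6 * CYCLES_PER_BIT
    t_local = cycles / F_LOCAL
    t_edge = cycles / F_EDGE
    print(f"{d:2d} Mbit: local {t_local:.2f} s, edge {t_edge:.3f} s")
```

For instance, the 5 Mbit packet needs 2.5e9 cycles: 0.5 s locally but only about 0.083 s at the edge, which is why offloading (and fairly scheduling the shared server) matters.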
“…Despite this, most existing MEC task offloading algorithms [5], [6] assume altruistic nodes and merely strive for a global optimum, which cannot be applied in scenarios where multiple parties participate. There are also works [8]–[10] focusing on general fair resource allocation in MEC, but they do not consider the priority-aware and deadline-sensitive service characteristics of healthcare sectors. Consequently, we propose a long-term proportional-fairness-driven edge healthcare scheme, referred to as FairHealth.…”
Section: Introduction
confidence: 99%
“…Therefore, different from the existing literature, we consider the problem of maximizing the minimum number of bits (of the high-complexity tasks) offloaded by users to an MEC server using NOMA, while constraining the total energy, transmission power, and offloading delay within given limits. In this context, Huang et al studied a max-min computation efficiency problem in a NOMA-MEC system [14], and a similar problem in a millimeter-wave setting was explored in [15]. To improve the physical-layer security of offloaded data in a NOMA-MEC system, a max-min anti-eavesdropping problem was presented in [16].…”
confidence: 99%
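The max-min offloaded-bits problem described above is typically solved by bisection on the common bit target: a target b is feasible if every user can deliver b bits within the deadline without exceeding its power budget. The sketch below uses a simple Shannon-rate model with orthogonal access for clarity (the cited works use NOMA, whose rate expressions involve successive interference cancellation and differ from this); all numeric parameters are made up:

```python
# Bisection search for the max-min number of offloaded bits.
# Rate model (assumed): r = W * log2(1 + p * g / N0); to send b bits within
# deadline T, user needs rate b/T, hence power p = (2^(b/(T*W)) - 1) * N0 / g.
W, T, N0, P_MAX = 1e6, 0.1, 1e-9, 0.5   # bandwidth (Hz), deadline (s), noise, power cap (W)
gains = [1e-6, 5e-7, 2e-7]              # hypothetical per-user channel gains

def feasible(b):
    """True if every user can offload b bits within T under its power cap."""
    return all((2 ** (b / (T * W)) - 1) * N0 / g <= P_MAX for g in gains)

lo, hi = 0.0, 1e8
for _ in range(60):                     # bisection on the common bit target
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if feasible(mid) else (lo, mid)
print(f"max-min offloaded bits ~ {lo:.0f}")
```

The binding user is the one with the worst channel gain, which is exactly the max-min (fairness) flavor of the formulation: the objective is limited by the weakest link rather than the sum rate.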