2017 European Conference on Networks and Communications (EuCNC)
DOI: 10.1109/eucnc.2017.7980678

Proactive edge computing in latency-constrained fog networks

Abstract: In this paper, the fundamental problem of distribution and proactive caching of computing tasks in fog networks is studied under latency and reliability constraints. In the proposed scenario, computing can be executed either locally at the user device or offloaded to an edge cloudlet. Moreover, cloudlets exploit both their computing and storage capabilities by proactively caching popular task computation results to minimize computing latency. To this end, a clustering method to group spatially proxima…

Cited by 112 publications (67 citation statements)
References 13 publications
“…In addition, mmWave also enables wireless backhauling [15], [16] that facilitates edge servers' prefetching popular content with low latency. At the processing level, proactive computing provides significant latency reduction while maximizing resource efficiency by avoiding repetitive and redundant on-demand computing [17]- [19]. Next, coded computing is effective in reducing parallel computing latency, which eliminates the dependency of processing tasks, thereby minimizing the worst-case latency due to a straggling task.…”
Section: Low Latency Enablers
confidence: 99%
“…The ideas of prefetching tasks [48] and proactive computing [17], [18] aim to develop techniques that learn and predict which tasks will be requested in the future and pre-compute them. Indeed, the success of proactive computing lies in a well-aimed choice of which tasks to proactively compute and which to leave for real-time processing.…”
Section: Low Latency Enabler 4: Proactive Computing
confidence: 99%
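The trade-off described above — deciding which tasks to pre-compute and which to leave for on-demand processing — can be sketched with a simple greedy heuristic. This is an illustrative example, not the method of the cited papers; the task names, probabilities, and costs are assumptions.

```python
def select_proactive_tasks(tasks, budget):
    """Greedily pick tasks to pre-compute within a compute budget.

    tasks: dict mapping task id -> (request_probability, compute_cost)
    budget: total compute units available for proactive execution
    Returns the set of task ids chosen for proactive computing; the
    rest are left for real-time (on-demand) processing.
    """
    # Pre-computing a task saves roughly request_probability * compute_cost
    # of expected latency; per unit of proactive compute spent this reduces
    # to the request probability, so the most likely tasks go first.
    chosen, used = set(), 0.0
    ranked = sorted(tasks.items(), key=lambda kv: kv[1][0], reverse=True)
    for task_id, (prob, cost) in ranked:
        if used + cost <= budget:
            chosen.add(task_id)
            used += cost
    return chosen

# Illustrative workload: frequently requested tasks fit the budget,
# the rarely requested one is left for on-demand computing.
tasks = {"render": (0.9, 4.0), "ocr": (0.6, 3.0), "rare": (0.05, 1.0)}
print(select_proactive_tasks(tasks, budget=7.0))  # {'render', 'ocr'}
```

In practice the request probabilities would come from a learned popularity model rather than being given, which is exactly the prediction problem the quoted passage points to.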
“…This is particularly helpful when the SBS performs caching since this process is based on the user's preference. The steps involved in our proposed clustering scheme are inherited from the work of ElBamby et al. and summarized as follows. The SBSs calculate the distance between users as a weighted function of the spatial distances and social distances (SDs), where the SD is meant to characterize how the preferences between users differ.…”
Section: Simulation Experiments
confidence: 99%
“…This is particularly helpful when the SBS performs caching since this process is based on the user's preference. The steps involved in our proposed clustering scheme are inherited from the work of ElBamby et al. [22] and summarized as follows.…”
Section: Clustering Scheme
confidence: 99%
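The weighted combination of spatial and social distance mentioned above can be sketched as follows. This is a minimal illustration, not the cited scheme's exact metric: the field names, the cosine-based social distance, and the weight `alpha` are assumptions.

```python
import math

def user_distance(u, v, alpha=0.5):
    """Weighted blend of spatial and social distance between two users.

    u, v: dicts with 'pos' (an (x, y) coordinate) and 'prefs'
          (a content-preference vector of equal length for both users).
    alpha: weight on the spatial term (illustrative assumption).
    """
    spatial = math.dist(u["pos"], v["pos"])
    # Social distance as 1 - cosine similarity of preference vectors:
    # identical preferences give 0, orthogonal preferences give 1.
    dot = sum(a * b for a, b in zip(u["prefs"], v["prefs"]))
    norm = math.sqrt(sum(a * a for a in u["prefs"])) * \
           math.sqrt(sum(b * b for b in v["prefs"]))
    social = 1.0 - (dot / norm if norm else 0.0)
    return alpha * spatial + (1 - alpha) * social

# Two co-located users with orthogonal preferences: spatial term is 0,
# social term is 1, so the blended distance is alpha-weighted.
u = {"pos": (0.0, 0.0), "prefs": [1.0, 0.0]}
v = {"pos": (0.0, 0.0), "prefs": [0.0, 1.0]}
print(user_distance(u, v))  # 0.5
```

A standard clustering routine (e.g. hierarchical clustering) could then group users using this pairwise distance, which is the role the SBSs play in the quoted scheme.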
“…In [24], a coded prefetching scheme and the corresponding delivery strategy were proposed, which rely on a combination of rank metric codes and maximum distance separable (MDS) codes in a non-binary finite field. In [25], a novel centralized coded caching scheme was proposed that approaches the rate-memory region achieved by the scheme in [24] as the number of users in the system increases, while only requiring a finite field of size 2^2. Moreover, instead of relying on the existence of some valid code, an explicit combinatorial construction of the caching scheme was provided.…”
confidence: 99%