2019
DOI: 10.1016/j.future.2018.10.051

Multi-criteria optimal task allocation at the edge

Abstract: In the Internet of Things (IoT), numerous nodes produce huge volumes of data that are the subject of various processing tasks. Task execution on top of the collected data can be realized either at the edge of the network or in the Fog/Cloud. Managing tasks at the network edge may limit the time required to conclude responses and return the final outcomes/analytics to end-users or applications. IoT nodes, due to their limited computational and resource capabilities, can execute a limited number of tasks over the co…

Cited by 43 publications (33 citation statements)
References 41 publications
“…1). This supports applications requiring pro-active decision making for: (i) allocating tasks/analytics queries only to relevant ENs based on their synopses, as studied in [30] and [49], (ii) sharing and updating ML models in Federated Learning [5] for environmental monitoring based on synopses and network load, (iii) transferring tasks for ML model training & inference to ENs based on network congestion [36], and (iv) separating context data by distributively gathering similar data into the same datasets, as studied in [28]. As exemplified in Fig.…”
Section: A. Edge Computing Infrastructure
confidence: 89%
“…Within this context, [8] proposes a two-step decision process for choosing which tasks will be executed locally at the IoT nodes where they are present, while the rest are migrated to a group of peer nodes in the network or to a fog/cloud server, in order to maximize performance. In [9], the authors propose a combined fog/cloud architecture that minimizes the overall latency of the services requesting resources by reducing the cloud access delay in an IoT setting.…”
Section: Preliminaries and Related Work
confidence: 99%
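The two-step process summarized in the excerpt above (keep a task local when the node can absorb it, otherwise migrate it to a peer or to the fog/cloud) can be illustrated with a minimal sketch. The load/capacity model, function names, and cost figures below are assumptions for illustration only, not the actual scheme from [8]:

# Hypothetical sketch of a two-step local-vs-offload decision; the
# load/capacity model and cost values are assumed, not taken from [8].
def decide_placement(task_load, node_capacity, peer_costs, cloud_cost):
    # Step 1: keep the task local if the node can absorb its load.
    if task_load <= node_capacity:
        return "local"
    # Step 2: otherwise migrate to the cheapest peer, falling back to fog/cloud.
    best_peer, best_cost = min(peer_costs.items(), key=lambda kv: kv[1])
    return best_peer if best_cost < cloud_cost else "fog/cloud"

# Example: a node with spare capacity 2.0 facing a task of load 3.5
print(decide_placement(3.5, 2.0, {"peer-A": 1.2, "peer-B": 0.8}, 1.0))  # -> peer-B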
“…We finally integrate the latency constraint in Eq. (8), that is, the overall latency of the system cannot exceed the designer-defined latency threshold. It should be stressed that, for Equation (5)/Equation (8), the first summation considers the computation energy/latency while the second summation considers the communication energy/latency.…”
Section: B. Reliability Optimization Problem Formulation
confidence: 99%
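One plausible reading of the constraint described in the excerpt above, written with notation assumed here rather than the cited paper's own symbols: binary assignment variables x_{ij} mapping task i to node j, computation latency l^{cmp}_{ij}, communication latency l^{com}_{ij}, and a designer-defined threshold L_{max}. The latency constraint of Equation (8) would then take the form

\sum_{i}\sum_{j} x_{ij}\, l^{\mathrm{cmp}}_{ij} \;+\; \sum_{i}\sum_{j} x_{ij}\, l^{\mathrm{com}}_{ij} \;\le\; L_{\max}

and the energy constraint of Equation (5) would have the same shape, with per-pair computation and communication energy terms in place of the latency terms.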
“…We propose a cost function that delivers the cost of an allocation for query-node pairs and implement the solution of our problem by adopting the Hungarian Method (with this specific algorithm, we enjoy the lowest possible complexity and return the result in the minimum time). We also focus on the 'one-shot' allocation, i.e., the allocation is accepted by the selected node instead of performing additional reasoning on whether the query can be executed locally or offloaded to peer nodes based on a reward/cost function and the distance in the network (this is the subject of [14]). The efficient allocation of queries to a number of nodes is also the subject of our previous efforts discussed in [15], [16], [17] and [18].…”
Section: Introduction
confidence: 99%
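The excerpt above names the Hungarian Method for the one-shot allocation of queries to nodes. Below is a minimal sketch of that assignment step using SciPy's linear_sum_assignment solver; the cost matrix entries are purely illustrative stand-ins for the paper's multi-criteria cost function:

# Sketch of one-shot query-to-node allocation as a linear assignment problem.
# The cost values are illustrative; the paper's cost function combines
# multiple criteria (e.g., load, data relevance, latency).
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian-style solver

# cost[q, n] = assumed cost of assigning query q to edge node n
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])

query_idx, node_idx = linear_sum_assignment(cost)
for q, n in zip(query_idx, node_idx):
    print(f"query {q} -> node {n} (cost {cost[q, n]})")
print("total cost:", cost[query_idx, node_idx].sum())

Each query is matched to exactly one node so that the summed allocation cost is minimized, which is what casts the one-shot allocation described in the excerpt as a linear assignment problem.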