2019
DOI: 10.3390/electronics8101076
On-Demand Computation Offloading Architecture in Fog Networks

Abstract: With the advent of the Internet-of-Things (IoT), end-devices have served as sensors, gateways, or local storage equipment. Due to their scarce resource capability, cloud-based computing is currently a necessary companion. However, raw data collected at devices must be uploaded to a cloud server, consuming a significantly large amount of network bandwidth. In this paper, we propose an on-demand computation offloading architecture in fog networks, by soliciting available resources from nearby edge devices an…

Cited by 8 publications (3 citation statements)
References 31 publications
“…This approach leverages Bayesian learning techniques to make informed decisions regarding the dynamic scaling of fog resources, determining both resource increases and decreases as needed. A policy for on-demand computation offloading in fog networks was presented in [13]. In this approach, a host node seeking additional computational resources beyond its own capabilities dynamically identifies accessible nodes and establishes an on-demand local resource provider network.…”
Section: Single-objective Approachesmentioning
confidence: 99%
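The offloading policy described in the citation statement above — a host node whose demand exceeds its own capacity identifies accessible nodes and assembles an on-demand local resource-provider network — can be illustrated with a minimal sketch. The node names, capacity units, and greedy highest-capacity-first selection rule here are all illustrative assumptions, not the protocol from the cited paper.

```python
# Hypothetical sketch: a host node recruits nearby edge nodes as
# computation delegates until its resource shortfall is covered.
from dataclasses import dataclass


@dataclass
class Node:
    name: str
    free_capacity: int  # abstract compute units the node can lend


def recruit_delegates(demand: int, own_capacity: int,
                      neighbors: list[Node]) -> list[Node]:
    """Greedily build a local resource-provider network for `demand` units."""
    shortfall = demand - own_capacity
    if shortfall <= 0:
        return []  # the host can run the task on its own resources
    providers = []
    # Assumed selection rule: prefer neighbors with the most spare capacity.
    for node in sorted(neighbors, key=lambda n: n.free_capacity, reverse=True):
        if shortfall <= 0:
            break
        providers.append(node)
        shortfall -= node.free_capacity
    if shortfall > 0:
        raise RuntimeError("nearby nodes cannot cover the demand")
    return providers


neighbors = [Node("edge-a", 4), Node("edge-b", 10), Node("edge-c", 2)]
delegates = recruit_delegates(demand=18, own_capacity=6, neighbors=neighbors)
print([n.name for n in delegates])  # edge-b and edge-a cover the 12-unit shortfall
```

In a real fog deployment the discovery step would involve network probing of reachable neighbors and the selection criterion would weigh latency and energy as well as raw capacity; the greedy rule here only captures the on-demand, host-initiated character of the approach.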
“…In [9], a scalable full-stack distributed engine is presented, tailored to energy data collection and forecasting in the smart-city context; it enables the integration of heterogeneous data measurements gathered from real-world systems and the forecasting of average power demand. Moreover, related to fog computing, the authors of [10] introduce a new architecture enabling on-demand computation offloading in the cloud architecture, reducing unnecessary network overhead by properly selecting the most effective edge devices as computation delegates.…”
Section: Related Workmentioning
confidence: 99%
“…These cloud service providers have upgraded their infrastructure to deliver more scalable and robust cloud computing, meeting demand with better availability and a growing number of data centers [1]. The resource constraints faced by end-user devices, ranging from IoT devices to computers and physical servers, have made cloud computing a preferred and flexible resource for deploying solutions [2], freeing developers to focus on building higher-quality software with fewer bugs [3] before delivering it to end-users. Previously, in the early days of computing, software was installed and run separately and independently on every machine [4].…”
Section: Introductionmentioning
confidence: 99%