2021
DOI: 10.1109/tmc.2019.2963668

Coding for Distributed Fog Computing in Internet of Mobile Things

Abstract: Internet of Mobile Things (IoMTs) refers to the interconnection of mobile devices, for example, mobile phones, vehicles, robots, etc. Because the physical resources of mobile devices in IoMTs are limited, processing mobile data normally requires substantial additional computing resources. Due to latency or bandwidth limitations, it may be infeasible to transfer large amounts of mobile data to a remote server for processing. Thus, distributed computing is one of the potential solutions to overcome these limitations. We cons…

Cited by 19 publications (12 citation statements) · References 35 publications
“…Meanwhile, it is inefficient (sometimes even infeasible) to transmit all data to a central node for analysis. For this reason, distributed machine learning (DML), which stores and processes all or part of the data on different nodes, has attracted significant research interest and applications [1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]. There are different methods of implementing DML, i.e., primal methods (e.g., distributed gradient descent [4, 7], federated learning [5, 6]) and primal–dual methods (e.g., the alternating direction method of multipliers (ADMM)) [16].…”
Section: Background and Motivations
confidence: 99%
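To make the primal approach named in the excerpt concrete, below is a minimal sketch of distributed gradient descent for a least-squares problem: each worker computes a gradient on its own data shard, and a central node averages the local gradients and updates the model. The shard layout, learning rate, and toy data are illustrative assumptions, not taken from the cited works.

```python
# Sketch of distributed gradient descent (primal DML): each worker computes a
# local gradient on its shard; a central node averages them and updates the model.
import numpy as np

def local_gradient(X_shard, y_shard, w):
    """Gradient of 0.5 * ||X w - y||^2 restricted to one worker's shard."""
    return X_shard.T @ (X_shard @ w - y_shard)

def distributed_gd(shards, dim, lr=0.5, iters=200):
    w = np.zeros(dim)
    n_total = sum(X.shape[0] for X, _ in shards)
    for _ in range(iters):
        # In a real deployment each term of this sum is computed on a separate node.
        grad = sum(local_gradient(X, y, w) for X, y in shards) / n_total
        w -= lr * grad
    return w

# Toy usage: split a synthetic dataset across 4 "workers".
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
w_true = np.arange(5.0)
y = X @ w_true + 0.01 * rng.normal(size=400)
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))
print(distributed_gd(shards, dim=5))   # close to w_true
```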
“…The action performed by the agent is to allocate network resources for the end-user function request based on the physical node mapping probability and the Floyd algorithm. We use the joint optimization result calculated by formula (14) as the agent's reward. Then we train the neural network using gradient descent; the gradient calculation method is…”
Section: Training and Testing
confidence: 99%
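The excerpt relies on the Floyd algorithm to map a function request onto the physical network. As a hedged illustration (not the cited paper's implementation), the sketch below computes all-pairs shortest path costs with Floyd–Warshall; an agent could consult this table when allocating resources along a route. The toy topology and link costs are made up.

```python
# Floyd-Warshall all-pairs shortest paths over a small directed network.
import math

def floyd_warshall(edge_costs, n):
    """edge_costs: dict {(u, v): cost} over nodes 0..n-1; returns an n x n cost table."""
    dist = [[0.0 if i == j else math.inf for j in range(n)] for i in range(n)]
    for (u, v), c in edge_costs.items():
        dist[u][v] = min(dist[u][v], c)
    for k in range(n):                      # allow node k as an intermediate hop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Example: 4 nodes with a few directed links (costs could be latency estimates).
edges = {(0, 1): 2.0, (1, 2): 1.5, (0, 2): 5.0, (2, 3): 1.0, (1, 3): 4.0}
print(floyd_warshall(edges, 4)[0][3])       # cheapest route cost from node 0 to 3: 4.5
```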
“…The edge server can serve end users in its region, and system tasks can be completed through message-passing coordination between edge servers. Given the large-scale, widely distributed deployment of SAGIN, it is necessary to adopt a distributed management solution for SAGIN [14].…”
Section: Introduction
confidence: 99%
“…One major challenge is that many computing frameworks are vulnerable to uncertain disturbances, such as node/link failures, communication congestion, and slow-downs [6]. Such disturbances, which can be modeled as stragglers that are slow to return results or even fail to do so, have been observed in many large-scale computing systems such as cloud computing [7], mobile edge computing [8], and fog computing [9].…”
Section: Introduction
confidence: 99%
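Since the paper under discussion concerns coding for distributed fog computing, a small sketch of the straggler-mitigation idea may help: split a matrix into k row blocks, encode them into n > k coded blocks, and recover A @ x from whichever k workers respond first. The Vandermonde code, block sizes, and straggler set below are illustrative assumptions, not the exact scheme of the paper.

```python
# Coded matrix-vector multiplication: any k of n coded partial results suffice,
# so up to n - k straggling fog nodes can simply be ignored.
import numpy as np

def encode_blocks(A, n, k):
    """Split A into k row blocks and produce n Vandermonde-coded blocks."""
    blocks = np.split(A, k)
    G = np.vander(np.arange(1.0, n + 1), k, increasing=True)   # n x k generator
    coded = [sum(G[i, j] * blocks[j] for j in range(k)) for i in range(n)]
    return G, coded

def decode(G, responders, partials):
    """Recover A @ x from the k coded products y_i = coded_block_i @ x that arrived."""
    Gs = G[responders, :]                          # k x k Vandermonde submatrix, invertible
    Z = np.linalg.solve(Gs, np.stack(partials))    # rows of Z are blocks[j] @ x
    return Z.reshape(-1)

# Toy run: n = 5 fog nodes, any k = 3 answers are enough; nodes 1 and 4 straggle.
rng = np.random.default_rng(0)
A, x = rng.normal(size=(12, 6)), rng.normal(size=6)
n, k = 5, 3
G, coded = encode_blocks(A, n, k)
responders = [0, 2, 3]
partials = [coded[i] @ x for i in responders]
print(np.allclose(decode(G, responders, partials), A @ x))     # True
```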