2018
DOI: 10.1109/tii.2018.2857203
Distributed Fog Computing Based on Batched Sparse Codes for Industrial Control

Cited by 22 publications (19 citation statements)
References 19 publications

“…Different file systems use different replica placement strategies. For example, HDFS file systems in IoT clusters use a rack-aware data replica placement strategy [42], [43]. When a client uploads a file, the file system first places the data block on the node where the client is located, and then uses pipelining to store the second replica on another node in the same rack as the client's node.…”
Section: Data Block Placement Methods: A. Rack-aware Based Data Placement
confidence: 99%
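The placement policy this excerpt describes is straightforward to sketch. Below is a minimal Python sketch of such a rack-aware replica chooser, assuming a hypothetical static cluster map (the node and rack names, and the `place_replicas` helper, are illustrative assumptions, not HDFS's actual API): first replica on the client's node, second on another node in the same rack, any further replicas spread across other racks.

```python
import random

# Hypothetical cluster map: rack id -> list of node ids (illustrative values).
CLUSTER = {
    "rack-0": ["node-00", "node-01", "node-02"],
    "rack-1": ["node-10", "node-11", "node-12"],
}

def rack_of(node):
    """Return the rack that hosts the given node."""
    for rack, nodes in CLUSTER.items():
        if node in nodes:
            return rack
    raise KeyError(node)

def place_replicas(client_node, num_replicas=3):
    """Choose replica targets following the rack-aware policy in the excerpt:
    first replica on the client's node, second on another node in the same
    rack, remaining replicas on nodes in other racks."""
    targets = [client_node]
    local_rack = rack_of(client_node)
    # Second replica: a different node in the client's rack.
    local_peers = [n for n in CLUSTER[local_rack] if n != client_node]
    if local_peers and num_replicas > 1:
        targets.append(random.choice(local_peers))
    # Remaining replicas: spread over nodes in other racks.
    remote = [n for r, ns in CLUSTER.items() if r != local_rack for n in ns]
    random.shuffle(remote)
    targets.extend(remote[: num_replicas - len(targets)])
    return targets

print(place_replicas("node-01"))  # e.g. ['node-01', 'node-00', 'node-12']
```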
“…In WSNs, the coverage areas of sensor nodes can overlap so that the monitored area is covered without gaps [55]. To save node energy and extend the network lifetime, the monitored area should be covered with the fewest possible nodes.…”
Section: A. Assumptions and Basic Definitions
confidence: 99%
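Choosing the fewest sensors that still cover the monitored area is an instance of geometric set cover, which is NP-hard, so a greedy heuristic is a common choice. Below is a minimal Python sketch of such a greedy selection; the sensor positions, sensing radius, and target grid are all illustrative assumptions, not values from the cited work.

```python
import math

# Hypothetical deployment: candidate sensor positions, sensing radius, and the
# grid of points that must be covered (all values illustrative).
SENSORS = [(1.0, 1.0), (3.0, 1.0), (1.0, 3.0), (3.0, 3.0), (2.0, 2.0)]
RADIUS = 1.6
TARGETS = [(x * 0.5, y * 0.5) for x in range(9) for y in range(9)]

def covers(sensor, point):
    """A point is covered if it lies within the sensor's sensing radius."""
    return math.dist(sensor, point) <= RADIUS

def greedy_min_cover(sensors, targets):
    """Greedy set-cover heuristic: repeatedly activate the sensor that covers
    the most still-uncovered target points, until everything is covered or no
    sensor adds coverage."""
    uncovered = set(targets)
    active = []
    while uncovered:
        best = max(sensors, key=lambda s: sum(covers(s, p) for p in uncovered))
        gained = {p for p in uncovered if covers(best, p)}
        if not gained:
            break  # remaining points are unreachable with this deployment
        active.append(best)
        uncovered -= gained
    return active, uncovered

active, leftover = greedy_min_cover(SENSORS, TARGETS)
print(f"activated {len(active)} sensors, {len(leftover)} points uncovered")
```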
“…Coding has been applied to distributed fog computing and machine learning to mitigate stragglers [4] and to reduce computation and communication costs [5]. In coded distributed machine learning [2], matrix multiplication [6], [7] and gradient descent [8], [9] have attracted considerable attention.…”
Section: Introduction
confidence: 99%
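As a concrete illustration of coded computation for straggler mitigation, here is a minimal NumPy sketch of MDS-coded matrix-vector multiplication in the spirit of the coded matrix multiplication line of work [6], [7]. The split into k blocks, the Vandermonde generator, and the straggler set are illustrative assumptions, not the construction of the cited papers: any k of the n worker results suffice to decode, so one straggler can simply be ignored.

```python
import numpy as np

def mds_coded_matvec(A, x, k=2, n=3, stragglers=frozenset({1})):
    """Sketch of MDS-coded matrix-vector multiplication for straggler
    mitigation: split A into k row blocks, encode them into n coded blocks
    with a Vandermonde generator, and decode A @ x from any k worker results."""
    rows = A.shape[0] // k
    blocks = [A[i * rows:(i + 1) * rows] for i in range(k)]        # k data blocks
    G = np.vander(np.arange(1, n + 1), k, increasing=True)         # n x k generator
    coded = [sum(G[w, j] * blocks[j] for j in range(k)) for w in range(n)]

    # Each non-straggling worker w would return coded[w] @ x; keep the first k.
    done = [w for w in range(n) if w not in stragglers][:k]
    results = np.stack([coded[w] @ x for w in done])               # k partial results

    # Decode: invert the k x k submatrix of G for the finished workers,
    # which recovers blocks[j] @ x for every j.
    decoded = np.linalg.solve(G[done].astype(float), results)
    return decoded.reshape(-1)

A = np.arange(24.0).reshape(6, 4)
x = np.ones(4)
assert np.allclose(mds_coded_matvec(A, x), A @ x)  # straggler w=1 never needed
print("decoded A @ x despite a straggler")
```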