2019
DOI: 10.1109/jproc.2019.2922285

Computation Offloading Toward Edge Computing

Cited by 346 publications (142 citation statements)
References 169 publications
“…We believe that this step is necessary, in order to efficiently set up a comprehensive reference system architecture. However, we also expect that part of this implementation will smoothly pass towards the edge, exploiting the well-known benefits of edge processing [9], and we hope that our experience may be useful to define effective specifications and support development in an automotive embedded environment.…”
Section: Introduction (mentioning)
confidence: 94%
“…Lin et al [35] analyzed computation offloading for edge computing. An insight into the architecture and types of edge computing nodes was given.…”
Section: A. Existing Surveys and Tutorials (mentioning)
confidence: 99%
“…[Flattened comparison-table fragment: Refs. [34] (2018), [35] (2019), and [36] (2019) are marked √/× against three criteria, while the authors' survey is marked √ for all three, offering a] complete comprehensive outlook of all aspects of computation offloading.…”
Section: B. Contributions (mentioning)
confidence: 99%
“…The implementation of deep learning in edge devices is challenging. Since the computation of deep learning requires high computing performance [10], edge computing gives many researchers another chance to run the deep learning model in a limited resource such as the edge device. The complexity of the model will increase the memory usage and execution time, which in turn increases power consumption.…”
Section: Introduction (mentioning)
confidence: 99%