2023
DOI: 10.1109/access.2023.3234761
Enabling All In-Edge Deep Learning: A Literature Review

Abstract: In recent years, deep learning (DL) models have demonstrated remarkable achievements on non-trivial tasks such as speech recognition, image processing, and natural language understanding. A significant contributor to this success is the proliferation of end devices, which act as a catalyst by supplying data to data-hungry DL models. However, computing DL training and inference remains the main challenge. Usually, central cloud servers are used for the computation, but this opens up other significant challenge…

Cited by 17 publications (13 citation statements)
References 268 publications
“…The complexity of computation and architecture in DL-based models is another challenge in this area. One potential solution for reducing this complexity is to use model compression techniques such as pruning, quantization, and low-rank factorization [124]. These techniques can help reduce the number of parameters and the computational resources required while maintaining good performance.…”
Section: Challenges, Potential Solutions and Future Prospects of AI Me…
confidence: 99%
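To make the compression techniques named above concrete, here is a minimal, hedged sketch in PyTorch showing all three on a toy two-layer network; the layer sizes, pruning amount, and rank are illustrative assumptions, not values from the cited work [124].

```python
# Illustrative sketch only: toy model; sizes and amounts are assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

# 1) Pruning: zero the 30% smallest-magnitude weights of the first layer.
prune.l1_unstructured(model[0], name="weight", amount=0.3)
prune.remove(model[0], "weight")  # bake the sparsity into the tensor

# 2) Dynamic quantization: store Linear weights as int8 for inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# 3) Low-rank factorization: approximate the 256x512 weight W with a
#    truncated SVD, i.e. two thinner Linear layers of rank 64.
W, b = model[0].weight.data, model[0].bias.data
U, S, Vh = torch.linalg.svd(W, full_matrices=False)
rank = 64  # illustrative; tune to trade accuracy against size
sqrt_S = S[:rank].sqrt()
low_rank = nn.Sequential(
    nn.Linear(512, rank, bias=False),  # weight = sqrt(S) * Vh
    nn.Linear(rank, 256),              # weight = U * sqrt(S)
)
low_rank[0].weight.data = sqrt_S.unsqueeze(1) * Vh[:rank]
low_rank[1].weight.data = U[:, :rank] * sqrt_S
low_rank[1].bias.data = b.clone()
```

At rank 64 the factorization stores 512·64 + 64·256 ≈ 49k parameters in place of the original 131k, which is the parameter-reduction effect the statement describes.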
“…Summing up the representative works from another perspective, we can find some common schemes, as shown in Table 2:

- IRS element selection, phase-shift control and power allocation [60]
- Transmitter power, PS factor of UE and IRS phase-shift matrix [82]
- Task offloading rate, power consumption and computational overhead [94]
- Task offloading and resource allocation [95]
- Latency, interference management and computational resource management [70]

Simplification of DRL decision space (a minimal action-masking sketch follows this citation block):
- Adopt a decision selection network to filter inappropriate mode alterations from the action space of the QNet model [75]
- Adopt LSTM to simplify the original state-action space and produce an approximate state-action space [94]
- Adopt the concept of rolling horizon control to slow down the growth of action aggregations [95]

Deep learning as an alternative to traditional analytical methods…”
Section: Frequently Used Techniques
confidence: 99%
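One scheme from the list above, filtering inappropriate actions out of a Q-network's action space, is easy to illustrate. This is a hedged sketch: a boolean validity mask stands in for the decision selection network of [75], and the state encoding and layer sizes are hypothetical.

```python
# Hedged sketch: mask-based action filtering before greedy selection.
import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(), nn.Linear(128, n_actions)
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)  # one Q-value per action

def masked_greedy_action(qnet: QNet, state: torch.Tensor,
                         valid: torch.Tensor) -> int:
    """Argmax of Q over valid actions only; invalid ones score -inf."""
    q = qnet(state).masked_fill(~valid, float("-inf"))
    return int(q.argmax().item())

# Usage: 10 candidate mode alterations, of which a (hypothetical)
# selection rule permits only four.
qnet = QNet(state_dim=8, n_actions=10)
state = torch.randn(8)
valid = torch.tensor([1, 0, 1, 0, 0, 1, 0, 0, 1, 0], dtype=torch.bool)
print(masked_greedy_action(qnet, state, valid))
```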
“…Deep learning as an alternative to traditional analytical methods (a minimal solver-imitation sketch follows this citation block):
- Obtain training data by an AO (alternating optimization)-based scheme to train an FFNN [82]
- Obtain training data from traditional ILP (Integer Linear Programming) to train an E-CNN [99]
- Resolve an NP-hard MINLP (Mixed-Integer Nonlinear Programming) problem using MADRL [60]

Distributed deep learning:
- Beamforming by DRL-based Federated Learning, to reduce communication overhead and avoid phase synchronization [64]
- Use multi-agent DRL (MADRL) to efficiently coordinate multi-AP and multi-IRS in a distributed manner, at lower information-exchange costs [60]
- DRL trained by MADDPG to trade off between energy efficiency and accuracy [77]
- FL (distributed DRL) to cope with the dynamic context of 6G [97]
- MADDPG to integrate computation and communication into the edge nodes of a deep edge network, aiming at pervasive intelligence in edge computing [98]

First, multi-objective optimization. Due to characteristics like sensitivity to signal blockage, the antennas and IRS of THz or mmWave communication exhibit high complexity.…”
Section: Frequently Used Techniques
confidence: 99%
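The "DL as an alternative to traditional analytical methods" scheme quoted above reduces to supervising a fast network with a slow classical solver. The sketch below assumes a placeholder solver, a stand-in for the AO/ILP routines of [82], [99] rather than their actual algorithms, and a hypothetical FFNN trained to imitate it.

```python
# Hedged sketch: imitate an expensive solver with a cheap FFNN.
import torch
import torch.nn as nn

def classical_solver(x: torch.Tensor) -> torch.Tensor:
    # Placeholder for an expensive AO/ILP solve; returns a toy "allocation".
    return torch.softmax(x, dim=-1)

ffnn = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 16))
opt = torch.optim.Adam(ffnn.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    x = torch.randn(32, 16)      # random problem instances
    y = classical_solver(x)      # the slow solver supplies the labels
    loss = loss_fn(ffnn(x), y)   # the FFNN learns a fast approximation
    opt.zero_grad()
    loss.backward()
    opt.step()

# At deployment, ffnn(x) replaces the solver at a fraction of the latency.
```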