2020
DOI: 10.1108/ijicc-04-2020-0038
Exploring compression and parallelization techniques for distribution of deep neural networks over Edge–Fog continuum – a review

Abstract (Purpose): The trend of "Deep Learning for the Internet of Things (IoT)" has gained fresh momentum, with enormous upcoming applications employing these models as their processing engine and the Cloud as their resource giant. But this picture leads to underutilization of the ever-increasing device pool of the IoT, which had already passed the 15 billion mark in 2015. Thus, it is high time to explore a different approach to tackle this issue, keeping in view the…
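The abstract breaks off before the survey's technical content, but the two families of techniques named in the title are easy to illustrate. The sketch below is an illustrative example only (NumPy-based, with hypothetical function names; it is not drawn from the paper): it applies magnitude-based weight pruning and uniform 8-bit quantization, two common compression steps used to shrink a DNN before it is pushed towards resource-constrained edge or fog devices.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

def quantize_uniform(weights, bits=8):
    """Uniform affine quantization to `bits`-bit integer codes, plus a dequantized copy."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / (2 ** bits - 1)
    if scale == 0.0:
        scale = 1.0
    codes = np.round((weights - w_min) / scale).astype(np.int32)
    dequantized = codes * scale + w_min   # what a device would reconstruct on arrival
    return codes, scale, w_min, dequantized

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=(256, 128)).astype(np.float32)
    w_sparse = prune_by_magnitude(w, sparsity=0.7)   # keep ~30% of the weights
    codes, scale, offset, w_hat = quantize_uniform(w_sparse, bits=8)
    print("non-zero weights kept:", np.count_nonzero(w_sparse), "of", w.size)
    print("max reconstruction error:", float(np.abs(w_sparse - w_hat).max()))
```

In an edge-fog setting the point of both steps is the same: fewer and smaller weights mean less memory, less bandwidth for model transfer, and cheaper arithmetic on constrained devices, at the cost of some accuracy.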

Cited by 9 publications (10 citation statements)
References: 64 publications
“…It has been widely used in various domains, such as transportation, health care, logistics, and agriculture [2]. In the IoT applications, millions of IoT devices are deployed and they continuously output large amounts of data [3], which are valuable for the enterprises to make reasonable business decisions in realtime [4]. However, how to process and analyze the IoT stream data are a big challenge for enterprises since traditional batch processing architecture cannot process large amounts of data in realtime.…”
Section: Introduction (mentioning)
confidence: 99%
“…Deep learning has been applied in several areas, including detecting fake news, self-driving cars, healthcare, visual recognition, and entertainment (Ghosh et al, 2021 ; Kar et al, 2019 ; Smith & Lovgren, 2018 ). Deep learning technology has also been successfully used in predictive planning, manufacturing, supply chain management, scheduling, forecasting, capacity allocation, inventory optimization, and so on (Chatterjee et al, 2021 ; Murphy & de Jongh, 2011 ; Nazir et al, 2020 ).…”
Section: Literature Review (mentioning)
confidence: 99%
“…Deep learning can be conceptualized as investigating existing cognitive structures and establishing various links to other concepts, realities, and ideas (Biggs, 1999 ; Entwistle, 1989 ). Deep learning models are concerned with areas such as time-setting data management, financial issues (Chatterjee et al, 2020a ; Harmancioglu et al, 2010 ; Kumar et al, 2018 ; Nazir et al, 2020 ). In this context, predictive maintenance capability seems to play an important role in influencing firms to adopt smart manufacturing systems (Hassan, 2017 ).…”
Section: Literature Review (mentioning)
confidence: 99%
“…create an ecosystem for achieving inference models with excellent performance. In this regard, many attempts have been reported to optimize the DNN models at edge devices [61], [62]. While Communication load, communication overhead, cost, memory, processing speed, network bandwidth, jitter, complexity are a few performance parameters, much of the preliminary research has focused on low-latency and energyefficient computations.…”
Section: Task Parallelization (mentioning)
confidence: 99%
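To make the notion of distributing DNN inference concrete, here is a minimal sketch (a toy NumPy example under assumed layer sizes, not an implementation from the cited works). It partitions a small multilayer perceptron at a chosen layer, runs the first part on the "edge device" and the remainder on the "fog node", and reports the size of the single activation tensor that would cross the link, which is the kind of communication load the latency- and energy-oriented studies mentioned above aim to keep small.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(layers, x):
    """Run a stack of (W, b) fully connected layers with ReLU activations."""
    for W, b in layers:
        x = relu(x @ W + b)
    return x

def partition(layers, split_at):
    """Split the layer stack: the first `split_at` layers stay on the edge device."""
    return layers[:split_at], layers[split_at:]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dims = [64, 128, 128, 10]   # toy network: three fully connected layers
    layers = [
        (0.1 * rng.normal(size=(dims[i], dims[i + 1])).astype(np.float32),
         np.zeros(dims[i + 1], dtype=np.float32))
        for i in range(len(dims) - 1)
    ]
    x = rng.normal(size=(1, dims[0])).astype(np.float32)

    edge_layers, fog_layers = partition(layers, split_at=1)
    activation = forward(edge_layers, x)        # computed on the edge device
    # `activation` is the only tensor that would cross the edge-fog link
    y_split = forward(fog_layers, activation)   # computation finished on the fog node

    y_full = forward(layers, x)                 # reference: the whole model on one device
    print("bytes sent over the link:", activation.nbytes)
    print("split result matches monolithic run:", bool(np.allclose(y_split, y_full)))
```

Moving the split point trades edge-side computation against the volume of intermediate data transmitted, which is exactly the kind of trade-off that partition-placement schemes for the edge-fog continuum explore.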