2018
DOI: 10.48550/arxiv.1810.01109
Preprint

AI Benchmark: Running Deep Neural Networks on Android Smartphones

Cited by 8 publications (12 citation statements). References 0 publications.
“…Table 2 summarizes their specifications. Note we only use the smartphone with DSP rather than that with NPU, since 1) NPUs are only programmable with vendor-provided Software Development Kits (SDKs) which have not been publicly released yet [42], and 2) DSPs in recent mobile SoCs are optimized for DNN inference so that they can act as NPUs [42,77].…”
Section: Real System Measurement Infrastructure (mentioning, confidence: 99%)
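The DSP-as-NPU setup this citation describes maps onto TensorFlow Lite's Hexagon delegate. Below is a minimal sketch, assuming a quantized `model.tflite` and the Hexagon delegate libraries are already present on the device; the library name, paths, and delegate availability depend on the SoC and SDK release and are assumptions here, not details from the cited paper.

```python
# Sketch: delegate TFLite inference to a Qualcomm Hexagon DSP.
# Assumes model.tflite (quantized) and libhexagon_delegate.so (plus the
# matching libhexagon_nn_skel*.so files) exist on the device.
import numpy as np
import tflite_runtime.interpreter as tflite

try:
    delegates = [tflite.load_delegate('libhexagon_delegate.so')]
except ValueError:
    delegates = []  # delegate missing: fall back to CPU execution

interpreter = tflite.Interpreter(model_path='model.tflite',
                                 experimental_delegates=delegates)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Run one inference on a zero tensor of the model's expected shape/dtype.
interpreter.set_tensor(inp['index'],
                       np.zeros(inp['shape'], dtype=inp['dtype']))
interpreter.invoke()
result = interpreter.get_tensor(out['index'])
```

Operations the delegate cannot handle fall back to the CPU automatically, which is why a DSP delegate can stand in for an NPU without changing the model.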
“…To address these performance and energy efficiency challenges, modern mobile devices employ more and more accelerators and/or co-processors, such as Graphics Processing Units (GPUs), Digital Signal Processors (DSPs), and Neural Processing Units (NPUs) [10,42], scaling up the overall system performance. Furthermore, the mobile system stack support for DNNs has become more mature, allowing DNN inference to leverage the computation and energy efficiency advantages provided by the co-processors.…”
Section: Introduction (mentioning, confidence: 99%)
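A small benchmark harness illustrates how that stack support lets the same model be timed on the CPU and on a co-processor just by swapping delegates. This is a sketch, not AI Benchmark's own harness; the model path, delegate library name, and run count are illustrative assumptions.

```python
# Sketch: compare mean inference latency with and without a delegate.
import time
import numpy as np
import tflite_runtime.interpreter as tflite

def mean_latency_ms(model_path, delegates=(), runs=50):
    """Average single-inference latency in milliseconds."""
    interp = tflite.Interpreter(model_path=model_path,
                                experimental_delegates=list(delegates))
    interp.allocate_tensors()
    inp = interp.get_input_details()[0]
    interp.set_tensor(inp['index'],
                      np.zeros(inp['shape'], dtype=inp['dtype']))
    interp.invoke()  # warm-up run, excluded from the timing loop
    start = time.perf_counter()
    for _ in range(runs):
        interp.invoke()
    return (time.perf_counter() - start) / runs * 1e3

print(f"CPU: {mean_latency_ms('model.tflite'):.1f} ms")
# With an accelerator, if its delegate library is present, e.g.:
# gpu = tflite.load_delegate('libtensorflowlite_gpu_delegate.so')
# print(f"GPU: {mean_latency_ms('model.tflite', [gpu]):.1f} ms")
```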
“…The common approach for running the current AI-based mobile applications is to send the data from the mobile device (front-end, or edge) to the cloud (back-end), run AI models in the cloud, and send the results back. New generations of mobile devices are capable of running deep neural models on board [1]; however, this presents a great challenge due to the limited power and computing resources available. In [2], it is shown that in many cases, the optimal strategy in terms of energy consumption and computation latency is to split the deep model and distribute the computation between the front-end and the back-end.…”
Section: Introduction (mentioning, confidence: 99%)
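The split-computation strategy cited from [2] can be sketched with a toy sequential network: run the first k layers on the device, ship the intermediate activation to the back-end, and finish inference there. The CNN, split point, and in-process "transport" below are illustrative assumptions, not the cited paper's implementation.

```python
# Sketch: split a sequential model between device (head) and cloud (tail).
import numpy as np
import tensorflow as tf

layers = [
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),
]
split_at = 2  # hypothetical split chosen by a latency/energy model

def run(stack, x):
    # Apply a chain of layers in order.
    for layer in stack:
        x = layer(x)
    return x

image = np.random.rand(1, 32, 32, 3).astype('float32')
activation = run(layers[:split_at], image)       # device-side head
payload = activation.numpy().tobytes()           # bytes that would go uplink
prediction = run(layers[split_at:], activation)  # cloud-side tail
print(prediction.shape, len(payload), 'uplink bytes')
```

In the framing of [2], the split point is chosen where the combined device computation, transfer, and cloud computation cost is minimized; depending on the layer, the intermediate activation can be smaller or larger than the raw input, which is why the optimum varies with the model, device, and network.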
“…We have witnessed the widespread adoption of deep neural networks and their applications in the last decade. With the ever-growing computing power of embedded and edge devices, neural networks are being deployed on such devices, further assisted by AI accelerators [2,10,24,25,27,38,44,48]. In the mobile phone industry, this trend has already been embraced by major manufacturers, including Samsung and Apple [2,44].…”
Section: Introduction (mentioning, confidence: 99%)