2019
DOI: 10.48550/arxiv.1912.04481
Preprint

SMAUG: End-to-End Full-Stack Simulation Infrastructure for Deep Learning Workloads

Sam Likun Xi, Yuan Yao, Kshitij Bhardwaj, et al.

Abstract: In recent years, there have been tremendous advances in hardware acceleration of deep neural networks. However, most of the research has focused on optimizing accelerator microarchitecture for higher performance and energy efficiency on a per-layer basis. We find that for overall single-batch inference latency, the accelerator may only make up 25-40%, with the rest spent on data movement and in the deep learning software framework. Thus far, it has been very difficult to study end-to-end DNN performance during …

Cited by 2 publications (1 citation statement); references 27 publications.
“…As we can see, unlike STONNE, existing simulators were originally developed for first-generation DNN accelerators and do not give support for simulating flexible DNN architectures. This is at least in part because it is not possible, without significant "heavy lifting", to extend these simulation tools to support next-generation DNN accelerators [8]: DNNBuilder [9], MAERI BSV [10], TVM [11], SCALE-Sim [12], MAESTRO [13], SMAUG [14]…”
Section: Introduction
Confidence: 99%