2021
DOI: 10.14529/jsfi210104

Computational Resource Consumption in Convolutional Neural Network Training – A Focus on Memory

Abstract: Deep neural networks (DNNs) have grown in popularity in recent years thanks to increases in computing power and in the size and relevance of data sets. This has made it possible to build more complex models and to extend them into more areas of research and application. At the same time, the amount of data generated during the training of these models puts great pressure on the capacity and bandwidth of the memory subsystem and, as a direct consequence, has become one of the biggest bottlenecks for the scalability…
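To make the memory-pressure claim concrete, the sketch below estimates how much activation data one training step of a convolutional network keeps alive for the backward pass. The VGG-16-style feature-map shapes, fp32 precision, and batch size of 32 are illustrative assumptions, not figures taken from the paper.

# Back-of-the-envelope estimate of activation memory in CNN training.
# Assumed: VGG-16-style conv stages on a 224x224 RGB input, fp32, batch 32.
# These shapes are illustrative, not taken from the paper.

BYTES_FP32 = 4
BATCH = 32

# (channels, height, width) of each conv layer's output feature map
feature_maps = [
    (64, 224, 224), (64, 224, 224),
    (128, 112, 112), (128, 112, 112),
    (256, 56, 56), (256, 56, 56), (256, 56, 56),
    (512, 28, 28), (512, 28, 28), (512, 28, 28),
    (512, 14, 14), (512, 14, 14), (512, 14, 14),
]

# In standard backpropagation every forward activation is retained until
# the backward pass consumes it, so the per-layer footprints accumulate.
total_bytes = sum(c * h * w for c, h, w in feature_maps) * BATCH * BYTES_FP32
print(f"Activations held for backward: {total_bytes / 2**30:.2f} GiB")

Even at this modest batch size the retained activations approach 2 GiB, while the convolutional weights of such a stack occupy only a few tens of megabytes; activations also scale linearly with batch size, which is why training-time memory traffic is dominated by feature maps rather than parameters.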

Cited by 0 publications
References 24 publications (36 reference statements)