2017
DOI: 10.4236/cs.2017.86010

A Multithreaded CGRA for Convolutional Neural Network Processing

Abstract: The convolutional neural network (CNN) is an essential model for achieving high accuracy in various machine learning applications, such as image recognition and natural language processing. One of the important issues for CNN acceleration with high energy efficiency and processing performance is efficient data reuse by exploiting the inherent data locality. In this paper, we propose a novel CGRA (Coarse Grained Reconfigurable Array) architecture with time-domain multithreading for exploiting input data locality. The m…
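The abstract's central point, that CNN acceleration benefits from reusing input data across overlapping convolution windows, can be seen in a short sketch. The example below is illustrative only and does not reproduce the paper's CGRA or its time-domain multithreading; it simply counts how many times a naive convolution re-reads each input element, which is the locality the proposed architecture aims to serve on-chip instead of from external memory. All names in the code are hypothetical.

```python
import numpy as np

def conv2d_with_reuse_count(inp, kernel):
    """Naive 2D convolution that also counts how often each input element
    is read. Illustrative sketch only; not the paper's architecture."""
    H, W = inp.shape
    K = kernel.shape[0]
    out = np.zeros((H - K + 1, W - K + 1))
    reads = np.zeros_like(inp)            # per-element read counter
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            window = inp[y:y + K, x:x + K]
            out[y, x] = np.sum(window * kernel)
            reads[y:y + K, x:x + K] += 1  # neighbouring windows overlap
    return out, reads

inp = np.arange(36, dtype=float).reshape(6, 6)
kernel = np.ones((3, 3))
out, reads = conv2d_with_reuse_count(inp, kernel)
# Interior input elements are read up to K*K = 9 times; keeping them in
# on-array storage (e.g. shared across time-multiplexed threads) avoids
# re-fetching them from external memory on every read.
print(reads.max())   # 9.0
```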

Cited by 11 publications (4 citation statements)
References 17 publications
“…However, only limited experiments have been done to evaluate the performance of EMAX on CNN. In [23], a multithreaded CGRA (called M-CGRA) is proposed to accelerate CNN only. However, the object inference flow contains not only CNNs but also other traditional algorithms.…”
Section: B. Application Perspective (mentioning)
confidence: 99%
“…M-CGRA [23] is a CGRA architecture that is designed for CNN acceleration. For comparison purposes, we list the mapping results of AlexNet on M-CGRA and SDT-CGRA in Table VII.…”
Section: E. Comparison with CGRA Implementations (mentioning)
confidence: 99%
“…They have been successfully used in domains such as text and speech. However, RNNs are susceptible to overfitting, so regularization is important [3]. Motivated by these networks, we consider the Tickysim SpiNNaker Model (network) to utilize its topological properties.…”
Section: Introduction (mentioning)
confidence: 99%