2019
DOI: 10.1016/j.sysarc.2019.01.007
A Survey and Taxonomy of FPGA-based Deep Learning Accelerators

Cited by 66 publications (23 citation statements). References 16 publications.
“…Different solutions for the acceleration of DNN algorithms include Graphics Processing Units (GPU) [11], Application-Specific Integrated Circuits (ASIC) [12], and Field-Programmable Gate Arrays (FPGA) [13]. The advantages of GPU-based solutions are programmability and comparatively low cost, while providing the high computational power that is typically required during the training phase of deep learning algorithms.…”
Section: A. Related Work
confidence: 99%
“…As a result, a large amount of computation time is required. Therefore, dedicated architectures such as GPUs [11], FPGAs [13], and ASICs [14] are needed.…”
Section: A Case Study: Epileptic Seizure Recognition
confidence: 99%
“…Deep learning originates from the study of artificial neural networks (ANN) and deep neural networks (DNN), and it is a deep machine learning model [22]. Deep learning executes a series of nonlinear transformations to learn how to automatically extract multilayer features from the original data, and it has been widely applied in image recognition, speech recognition, natural language processing, drug discovery, and other fields [23].…”
Section: Computational Logistics and Deep Learning
confidence: 99%
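The statement above characterizes a deep model as a stack of nonlinear transformations that extract features layer by layer. A minimal sketch of that idea in Python/NumPy is given below; it is illustrative only and not taken from the surveyed paper, the layer sizes are hypothetical, and the weights are random placeholders rather than parameters learned from data.

import numpy as np

def relu(x):
    # Elementwise nonlinearity applied after each affine transformation
    return np.maximum(0.0, x)

def forward(x, layers):
    # Apply a series of affine + nonlinear transformations to the input;
    # each stage produces a new feature representation of the data.
    h = x
    for W, b in layers:
        h = relu(h @ W + b)
    return h

rng = np.random.default_rng(0)
dims = [64, 128, 128, 10]                 # hypothetical layer widths
layers = [(rng.standard_normal((i, o)) * 0.1, np.zeros(o))
          for i, o in zip(dims[:-1], dims[1:])]

x = rng.standard_normal((1, dims[0]))     # one example input vector
features = forward(x, layers)             # deepest-layer representation
print(features.shape)                     # (1, 10)

In a trained network the weight matrices would be fitted to data, so each successive layer learns progressively more abstract features; the sketch only shows the structural point that the model is a composition of nonlinear stages.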
“…The motivation behind deploying such a network at the edge with an FPGA is to strike a balance between low power consumption and high performance. Thus, FPGA design of DCNNs has become an active research topic in recent years [18][19][20].…”
Section: Introduction
confidence: 99%