2020 International Conference on Field-Programmable Technology (ICFPT) 2020
DOI: 10.1109/icfpt51103.2020.00010
Mapping Multiple LSTM models on FPGAs

Cited by 8 publications (5 citation statements)
References 24 publications
“…Following [37], we conduct a design space exploration to find a parameter set that gives the best performance under FPGA resource and bandwidth constraints. Due to page limit, we briefly describe the performance model for each module, which is defined as:…”
Section: Design Space Exploration For P3NetCore
confidence: 99%
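The statement above describes a design space exploration (DSE) that searches for the parameter set giving the best performance under FPGA resource and bandwidth constraints. A minimal sketch of such a search is below; the parameter names (`pe`, `simd`), the resource and bandwidth cost formulas, and all budget numbers are illustrative assumptions, not the cited paper's actual performance model.

```python
import itertools

# Hypothetical DSE sketch: enumerate candidate parallelism parameters,
# estimate latency with a simple analytical model, and keep the best
# design point that fits the FPGA resource and bandwidth budgets.
# The cost and performance models here are assumptions for illustration.
def explore(workload_ops, dsp_budget, bw_budget_gbps):
    best = None
    for pe, simd in itertools.product([1, 2, 4, 8, 16], repeat=2):
        dsps = pe * simd                     # assumed DSP cost model
        bw = 0.5 * pe * simd                 # assumed bandwidth demand (GB/s)
        if dsps > dsp_budget or bw > bw_budget_gbps:
            continue                         # infeasible design point
        cycles = workload_ops / (pe * simd)  # modeled latency in cycles
        if best is None or cycles < best[0]:
            best = (cycles, {"pe": pe, "simd": simd})
    return best
```

In a real flow the per-module models are composed (compute, on-chip memory, off-chip transfers) and the slowest module bounds throughput; the exhaustive loop above is viable because the parameter grid is small.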
“…Authors of [6,24] have shown that deeper networks are more resilient, and that the use of batch normalization layers in the network architecture helps generalize and improve the resiliency of the model. In [34] it was shown that the impact of a fault is greater when it occurs toward the back of the network (i.e., in the last layers), whereas fault effects tend to be mitigated or neutralized when they occur in the initial layers (i.e., the first layers). In [6,27] it was demonstrated that pruning and quantization also help increase the resiliency of the network model.…”
Section: Factors Affecting The Resiliency Of Deep Neural Networks
confidence: 99%
“…An early approach in this direction was presented in [13] in the context of multi-LSTM applications. The proposed scheme tunably decomposes the weight matrices of multiple LSTM models and represents them with a shared low-rank representation.…”
Section: Approximate Computing For Multiple DNNs
confidence: 99%
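The shared low-rank idea attributed to [13] above can be sketched as follows: stack the weight matrices of several models, factor the stack with a truncated SVD, and keep one shared basis plus small per-model coefficient matrices. This is a generic illustration of the technique, not the paper's actual decomposition; the shapes, rank, and function name are assumptions.

```python
import numpy as np

# Hedged sketch: approximate several models' weight matrices with a single
# shared low-rank basis. Each model i is then represented by a small
# coefficient matrix coeffs[i], with weights[i] ~= coeffs[i] @ basis.
def shared_low_rank(weights, rank):
    """weights: list of (m, n) matrices, one per LSTM model."""
    stacked = np.vstack(weights)                # (num_models * m, n)
    u, s, vt = np.linalg.svd(stacked, full_matrices=False)
    basis = vt[:rank]                           # shared (rank, n) basis
    m = weights[0].shape[0]
    coeffs = [u[i * m:(i + 1) * m, :rank] * s[:rank]
              for i in range(len(weights))]
    return coeffs, basis
```

The rank parameter is the "tunable" knob: a smaller rank shrinks the shared representation (and the FPGA memory footprint) at the cost of approximation error.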