Hardware-Aware Softmax Approximation for Deep Neural Networks
Published: 2019
DOI: 10.1007/978-3-030-20870-7_7

Cited by 21 publications (13 citation statements). References 15 publications.
Citation statements by type: 0 supporting, 13 mentioning, 0 contrasting.
“…Moreover, the use of CNNs is hindered on small hardware, especially mobile devices, because of their high computational cost. In this regard, different hardware accelerators are needed to reduce both execution time and power consumption [247]. Several very interesting accelerators have already been proposed.…”
Section: Future Directions (mentioning); confidence: 99%
“…Conversely, the softmax operations used in the attention module are approximated following a more complex methodology, similar to the one used in [26]. Consider the matrix M as defined in Equation 1, a given row j of M, and the vector m_j as input to the softmax operation.…”
Section: Integer Precision Approximation (mentioning); confidence: 99%
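The excerpt above treats softmax row by row: each row m_j of the matrix M is normalized independently. As an illustration only, the sketch below contrasts the standard numerically stable row-wise softmax with a base-2 reformulation of the kind that hardware-aware approximations such as [26] build on; the function names, the 1 + f*ln(2) fractional approximation, and the NumPy usage are assumptions for illustration, not the exact integer-precision scheme of the cited work.

```python
import numpy as np

def softmax_row(m_j):
    """Reference: numerically stable softmax over one row m_j of M."""
    z = m_j - np.max(m_j)      # shift by the row maximum to avoid overflow
    e = np.exp(z)
    return e / np.sum(e)

def softmax_base2_row(m_j):
    """Illustrative hardware-friendly variant (an assumption, not the
    paper's exact scheme): rewrite e^x as 2^(x*log2(e)), split the
    exponent into integer and fractional parts, and approximate the
    fractional power. In hardware the integer part becomes a bit shift
    and the fractional part a small lookup table or polynomial."""
    z = (m_j - np.max(m_j)) * np.log2(np.e)      # exp(x) == 2**z
    k = np.floor(z)                              # integer part -> shift
    f = z - k                                    # fractional part in [0, 1)
    pow2 = np.exp2(k) * (1.0 + f * np.log(2.0))  # crude 2**f ~= 1 + f*ln(2)
    return pow2 / np.sum(pow2)

# Hypothetical row-wise usage on a small matrix M.
M = np.array([[1.0, 2.0, 3.0],
              [0.5, 0.5, 4.0]])
for j, m_j in enumerate(M):
    print(j, softmax_row(m_j), softmax_base2_row(m_j))
```

Splitting the exponent this way is what makes the operation attractive for fixed-point datapaths: the integer part maps to a shifter, and only the fractional part needs a small table or low-order polynomial.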
“…However, there is still a problem: CNN architectures consume large computational resources and incur significant overhead [26][27][28]. Various types of hardware accelerators and their architectures are discussed in [29] for reducing power consumption and overhead. One example is the FPGA-based accelerator discussed in [30] for minimizing power consumption.…”
Section: Literature Review (mentioning); confidence: 99%
“…One example of such an accelerator is the FPGA discussed in [30] for minimizing power consumption. Further, some hardware-related models using modern techniques were introduced in [31]. Modern research also relies on various optimization techniques.…”
Section: Literature Review (mentioning); confidence: 99%