2018
DOI: 10.1109/tcsi.2018.2792902

Design and Evaluation of Approximate Logarithmic Multipliers for Low Power Error-Tolerant Applications

Abstract: In this work, the designs of both non-iterative and iterative approximate logarithmic multipliers (LMs) are studied to further reduce power consumption and improve performance. Non-iterative approximate LMs (ALMs) that use three inexact mantissa adders are presented. The proposed iterative approximate logarithmic multipliers (IALMs) use a set-one adder in both mantissa adders during an iteration; they also use lower-part-or adders and approximate mirror adders for the final addition. Error analysis and simula…

Cited by 126 publications
(60 citation statements)
References 39 publications
“…In addition to hardware performance, we compare the normalized mean error distance (NMED) for all evaluated multipliers. The NMED is defined in [19], [38], [39] as the average error distance over all input combinations, normalized by the maximum output of the exact multiplier.…”
Section: Synthesis Results
Mentioning; confidence: 99%
“…Liu et al [19] utilize truncated approximate adders for mantissa addition, which yields a design with smaller barrel shifters. Their multiplier offers accuracy similar to Mitchell's multiplier while delivering a smaller power-delay product.…”
Section: A. Approximate Logarithmic Multipliers
Mentioning; confidence: 99%
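Mitchell's baseline scheme, which the cited truncated-adder designs build on, approximates log2 of each operand as k + f (where the operand equals 2^k · (1 + f)), adds the two logs, and takes the antilog with the same linear approximation. A behavioral sketch, using floating-point arithmetic for clarity where hardware versions use fixed-point mantissa adders:

```c
#include <stdint.h>

/* Index of the leading one bit, i.e. floor(log2 x), for x > 0 */
static unsigned ilog2_floor(uint32_t x) {
    unsigned k = 0;
    while (x >>= 1) k++;
    return k;
}

/* Behavioral model of Mitchell's logarithmic multiplier */
static uint64_t mitchell_mul(uint32_t a, uint32_t b) {
    if (a == 0 || b == 0) return 0;
    unsigned ka = ilog2_floor(a), kb = ilog2_floor(b);
    double fa = (double)a / (1u << ka) - 1.0;   /* mantissa fraction in [0,1) */
    double fb = (double)b / (1u << kb) - 1.0;
    double s  = fa + fb;                        /* approximate log2 of a*b */
    if (s < 1.0)   /* no carry out of the mantissa adder */
        return (uint64_t)((1.0 + s) * ((uint64_t)1 << (ka + kb)));
    return (uint64_t)(s * ((uint64_t)1 << (ka + kb + 1)));
}
```

The result is exact whenever either operand is a power of two (its fraction is zero, so the dropped fa·fb cross term vanishes); otherwise the product is underestimated, with a worst-case relative error of about 11.1%. That built-in error margin is why variants with cheaper, truncated mantissa adders can match Mitchell's accuracy while saving power.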
“…Exponentials and logarithms are important in deep neural networks [21]. Approximate multipliers that extend Mitchell's technique are presented and analyzed in [22].…”
Section: Some Classical Tricks of the Trade
Mentioning; confidence: 99%
“…Relative difference between Approximation (55) to √x, with the constant 127 · 2²² replaced by 532369100, and the actual √x on [1, 4].…”
Mentioning; confidence: 99%
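The constants in this statement fit the standard IEEE-754 bit-shift square-root approximation: halving the raw bit pattern of a float roughly halves its log2, and adding a bias constant restores the exponent offset. The analytic bias is 127 · 2²² = 532676608; 532369100 is the tuned replacement the quote mentions. (That this corresponds to the cited paper's "Approximation (55)" is an assumption based on the constants alone.)

```c
#include <stdint.h>
#include <string.h>

/* Bit-level sqrt approximation: (bits >> 1) + bias on the raw float bits.
 * 532369100 is the tuned bias from the quoted statement; the analytic
 * value would be 127 * 2^22 = 532676608. */
static float approx_sqrt(float x) {
    uint32_t i;
    float y;
    memcpy(&i, &x, sizeof i);      /* reinterpret IEEE-754 binary32 bits */
    i = (i >> 1) + 532369100u;     /* halve log2(x), re-bias the exponent */
    memcpy(&y, &i, sizeof y);
    return y;
}
```

With the analytic bias the error on [1, 4] is one-sided (exact at powers of four, worst near √2); shifting the bias down as above balances the error, keeping the relative difference within a few percent over the interval.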
“…There are many other ways of approximating multiplication that have not been applied to deep CNNs, such as [43], [44], [45] among countless others. While we believe that the studied multiplier designs are the most promising, there are most likely other related opportunities for improving CNNs.…”
Section: Related Work
Mentioning; confidence: 99%