A new tensor product formulation for Toom's convolution algorithm
1999
DOI: 10.1109/78.752626

Cited by 8 publications (9 citation statements)
References 6 publications
“…Therefore, the total number of multiplications required is equal to the number of multiplications in the single-core stage and is equal to [1]. Tables I and II show the computational advantage of the proposed improved algorithm when compared to previous algorithms, such as the original matrix-vector multiplication and the fast Fourier transform (FFT) algorithms.…”
Section: Multiplexed Architecture of the 1-D Convolution
mentioning
confidence: 94%
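As a generic illustration of where the multiplication savings in Toom-style short convolutions come from (a minimal NumPy sketch; the specific counts tabulated in the citing paper are not reproduced here), a length-2 linear convolution can be computed with 3 multiplications instead of the 4 needed by the direct method:

import numpy as np

# Illustrative sketch only (not the cited paper's algorithm): Toom-2
# (Karatsuba) linear convolution of two length-2 sequences using
# 3 multiplications instead of the 4 required by direct computation.
def toom2_linear_convolution(h, x):
    h0, h1 = h
    x0, x1 = x
    m0 = h0 * x0                    # multiplication 1
    m1 = (h0 + h1) * (x0 + x1)      # multiplication 2
    m2 = h1 * x1                    # multiplication 3
    return np.array([m0, m1 - m0 - m2, m2])   # length-3 result y = h * x

h = np.array([1.0, 2.0])
x = np.array([3.0, 4.0])
assert np.allclose(toom2_linear_convolution(h, x), np.convolve(h, x))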
“…The proposed work is based on a nontrivial modification of the one-dimensional (1-D) convolution algorithm presented in [1] and shown in Fig. 1.…”
mentioning
confidence: 99%
“…We employ several techniques to manipulate such decompositions into suitable expressions that can be mapped efficiently onto very large scale integration (VLSI) structures. Tensor products (or Kronecker products), when coupled with permutation matrices, have proven to be useful in providing unified decomposable matrix formulations for multidimensional transforms, convolutions, matrix multiplication, and other fundamental computations [2], [3], [6].…”
Section: Introduction
mentioning
confidence: 99%
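To make the tensor-product formulation referred to above concrete, the following is a minimal NumPy sketch (an illustration under generic assumptions, not the construction of [1]): the Kronecker-product identity that lets a separable two-dimensional operation be written as a single decomposable matrix acting on the vectorized input.

import numpy as np

# Illustrative sketch: (A kron B) vec(X) = vec(A X B^T) for row-major vec.
# This identity is the basic building block behind tensor-product
# (Kronecker) formulations of multidimensional transforms and convolutions.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # operator applied along the first dimension
B = rng.standard_normal((4, 4))   # operator applied along the second dimension
X = rng.standard_normal((3, 4))   # 2-D input

lhs = np.kron(A, B) @ X.flatten()   # one large Kronecker-structured matrix
rhs = (A @ X @ B.T).flatten()       # two small matrices applied separately
assert np.allclose(lhs, rhs)

In this framework, permutation matrices (stride permutations) account for the data reordering between stages, which is what allows such decompositions to be rearranged into forms that map well onto VLSI structures.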