ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp39728.2021.9413800
Approximate Weighted CR Coded Matrix Multiplication

Abstract: One of the most common operations in signal processing is matrix multiplication. However, it presents a major computational bottleneck when the matrix dimension is high, as can occur for large data size or feature dimension. Two different approaches to overcoming this bottleneck are: 1) low rank approximation of the matrix product; and 2) distributed computation. We propose a scheme that combines these two approaches. To enable distributed low rank approximation, we generalize the approximate matrix CR-multipl…
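The CR-multiplication the abstract builds on can be illustrated with a minimal single-server NumPy sketch of the classic sampling step (Drineas–Kannan–Mahoney style); the function name, dimensions, and sample count below are illustrative assumptions, not the paper's weighted, coded, distributed construction.

```python
import numpy as np

def cr_multiply(A, B, c, rng=None):
    """Approximate A @ B by sampling c column-row outer products.

    Column i of A (with matching row i of B) is drawn with probability
    proportional to ||A[:, i]|| * ||B[i, :]||, the choice that minimizes
    the expected Frobenius-norm error, and each sampled term is rescaled
    by 1 / (c * p_i) so the estimator is unbiased.
    """
    rng = np.random.default_rng(rng)
    norms = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = norms / norms.sum()
    idx = rng.choice(A.shape[1], size=c, p=p)  # sample with replacement
    return (A[:, idx] / (c * p[idx])) @ B[idx, :]

# Demo: a modest number of sampled outer products tracks the exact product.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 60))
B = rng.standard_normal((60, 80))
approx = cr_multiply(A, B, c=300, rng=1)
rel_err = np.linalg.norm(approx - A @ B) / np.linalg.norm(A @ B)
```

In a distributed variant, each of the sampled outer products `A[:, i] @ B[i, :]` could be assigned to a different server, which is what makes the sampling view compatible with coded computation.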

Cited by 8 publications (9 citation statements).
References 24 publications (31 reference statements).
“…There is a benefit when applying a garbled Hadamard transform in this scenario, as the complexity of multiplication resulting from the sketching is less than that of regular multiplication. Also, if such a random projection is used before performing CR-multiplication distributively [15], [16], [21], the approximate result will be the same. Moreover, our dimensionality reduction algorithm can be utilized by a single server, to store low-rank approximations of very large data matrices.…”
Section: Concluding Remarks and Future Work
confidence: 99%
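The role of such a random projection applied before the multiplication can be sketched with a plain Gaussian sketch; this is a hedged stand-in for the citing work's garbled Hadamard transform (a structured sketch would serve the same purpose with a cheaper projection step), and `sketched_multiply` and all dimensions are illustrative.

```python
import numpy as np

def sketched_multiply(A, B, k, rng=None):
    """Approximate A @ B through a shared random projection S.

    S has i.i.d. N(0, 1/k) entries, so E[S @ S.T] = I and the sketched
    product (A @ S) @ (S.T @ B) is an unbiased estimate of A @ B; the
    two factors are multiplied in the reduced dimension k instead of
    the original shared inner dimension n.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[1]  # shared inner dimension of A and B
    S = rng.standard_normal((n, k)) / np.sqrt(k)
    return (A @ S) @ (S.T @ B)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 30))
B = rng.standard_normal((30, 20))
approx = sketched_multiply(A, B, k=64, rng=1)
```

Because the estimate is unbiased, averaging independent sketches converges to the exact product, which is consistent with the statement that projecting before a distributed CR-multiplication leaves the approximate result unchanged.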
“…The paper titled "Anytime Coding for Distributed Computation" [47] proposes replicating subtasks according to the job, while [33] and [48] incorporate sketching into CC. It is worth noting that even though we focus on gradient methods in this paper, our approach also applies to second-order methods, as well as approximate matrix products through the CR-multiplication algorithm [9], [28], [49]. We briefly discuss this in Section V.…”
Section: A. Related Work
confidence: 99%
“…This framework accommodates a central class of sketching algorithms, that of importance (block) sampling algorithms (e.g. CUR decomposition [27], CR-multiplication [28]). Coded computing is a novel computing paradigm that utilizes coding theory to effectively inject and leverage data and computation redundancy, mitigating errors and slow or non-responsive servers (known as stragglers), among other fundamental bottlenecks in large-scale distributed computing.…”
Section: Introduction
confidence: 99%