2023
DOI: 10.1109/tdsc.2021.3139318
More Efficient Secure Matrix Multiplication for Unbalanced Recommender Systems

Abstract: With recent advances in homomorphic encryption (HE), it has become feasible to run non-interactive machine learning (ML) algorithms on encrypted data without decryption. In this work, we propose novel encoding methods to pack matrices compactly and more efficient methods to perform matrix multiplication on homomorphically encrypted data, yielding a 1.5×–20× speedup for slim rectangular matrix multiplication compared with the state of the art. Moreover, we integrate our optimized secure matrix arithmet…

Cited by 8 publications (5 citation statements)
References 39 publications
“…Source matrices can be enlarged to meet the shape requirements of MM with variable shapes, although this may increase processing time and resource utilization. Huang et al. [28] advocated blocking to better handle rectangular MM, treating the source matrices as block matrices composed of square blocks. This method is appealing for large matrices that cannot fit in one ciphertext.…”
Section: Related Work
confidence: 99%
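The blocking idea in the excerpt above can be sketched in plaintext: partition both source matrices into square b × b blocks and accumulate blockwise products, so that under HE each block could be packed into a single ciphertext. This is a minimal illustrative sketch, not the authors' implementation; all names are invented here.

```python
# Plaintext sketch of blocked matrix multiplication: each b x b block
# product stands in for an operation on a single ciphertext.

def block(M, i, j, b):
    """Extract the b x b block at block-coordinates (i, j)."""
    return [row[j * b:(j + 1) * b] for row in M[i * b:(i + 1) * b]]

def block_matmul(A, B, b):
    """Multiply A (m x l) by B (l x n); m, l, n assumed multiples of b."""
    m, l, n = len(A), len(B), len(B[0])
    C = [[0] * n for _ in range(m)]
    for bi in range(m // b):
        for bj in range(n // b):
            for bk in range(l // b):
                Ablk = block(A, bi, bk, b)
                Bblk = block(B, bk, bj, b)
                # Accumulate the b x b block product into block (bi, bj) of C.
                for i in range(b):
                    for j in range(b):
                        C[bi * b + i][bj * b + j] += sum(
                            Ablk[i][k] * Bblk[k][j] for k in range(b))
    return C
```

In an HE setting the inner accumulation would be carried out homomorphically on the packed blocks, which is why the quote notes the approach suits matrices too large for one ciphertext.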
“…When l = min{m, l, n}, we make no change to A and B (lines 11-15). After initializing several relevant variables (lines 16-18), Algorithm 2 goes through a loop to compute and accumulate the partial products (lines 19-28). More specifically, we first perform the ϵ and ω transformations on the expanded matrix (A or B) (lines 20-21), which are then combined into C_temp using element-wise HE multiplication (line 22).…”
Section: n = min{m, l, n}
confidence: 99%
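The accumulation loop the excerpt describes follows the familiar diagonal-style pattern for HE matrix products: permute each input once, then sum element-wise products of shifted copies. The sketch below is a plaintext analog under the assumption that the ϵ and ω transformations play the role of the usual σ/τ permutations (this mapping is my assumption, not stated in the excerpt); all names are illustrative.

```python
# Plaintext analog of diagonal-style square matrix multiplication:
# C = sum_k (column-shifted sigma(A)) ⊙ (row-shifted tau(B)).
# Shifts model ciphertext rotations; ⊙ models element-wise HE mult.

def sigma(A):  # sigma(A)[i][j] = A[i][(i + j) mod n]
    n = len(A)
    return [[A[i][(i + j) % n] for j in range(n)] for i in range(n)]

def tau(B):    # tau(B)[i][j] = B[(i + j) mod n][j]
    n = len(B)
    return [[B[(i + j) % n][j] for j in range(n)] for i in range(n)]

def shift_cols(M, k):
    n = len(M)
    return [[M[i][(j + k) % n] for j in range(n)] for i in range(n)]

def shift_rows(M, k):
    n = len(M)
    return [[M[(i + k) % n][j] for j in range(n)] for i in range(n)]

def diag_matmul(A, B):
    """C = A @ B for square A, B via n element-wise products."""
    n = len(A)
    sA, tB = sigma(A), tau(B)
    C = [[0] * n for _ in range(n)]
    for k in range(n):
        Ak, Bk = shift_cols(sA, k), shift_rows(tB, k)
        for i in range(n):
            for j in range(n):
                C[i][j] += Ak[i][j] * Bk[i][j]  # element-wise mult analog
    return C
```

Each loop iteration accumulates one "partial product" into the running result, mirroring the accumulation over lines 19-28 described in the quote.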
“…The main goal of our algorithm is to reduce the number of rotations and multiplications used, which occupy most of the algorithm's runtime (see Section 5.1 for the cost of each operation in HE). Note that including a transpose in the multiplication is more efficient than computing a matrix multiplication of the form AB directly as in (Jiang et al., 2018; Huang et al., 2021), since we would otherwise need to perform a transpose in each training iteration, which is a costly operation and requires additional multiplicative depth.…”
Section: Encrypted Matrix Multiplication
confidence: 99%
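The point about fusing the transpose can be seen in a plaintext sketch: computing AᵀB directly by reindexing avoids ever materializing Aᵀ, whereas a plain AB routine would require an explicit transpose first. This is an illustrative sketch, not the cited papers' code; the function name is invented here.

```python
# Plaintext sketch of transpose-fused multiplication: compute A^T @ B
# by reading A column-wise, without forming A^T as an intermediate.

def matmul_At_B(A, B):
    """Compute A^T @ B where A is m x n and B is m x p."""
    m, n, p = len(A), len(A[0]), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for j in range(p):
            # (A^T B)[i][j] = sum_k A[k][i] * B[k][j]
            C[i][j] = sum(A[k][i] * B[k][j] for k in range(m))
    return C
```

In plaintext the saving is just an index swap, but under HE an explicit transpose costs rotations and multiplicative depth, which is why the quote treats fusing it into the multiplication as the more efficient choice per training iteration.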
“…We experimentally showed that the execution times of our algorithms are significantly smaller than those of (Crockett, 2020). Some encrypted matrix multiplication algorithms (Jiang et al., 2018; Huang et al., 2021) are of the form AB, that is, they do not include a transpose. As mentioned previously, these are unsuitable for our purpose because they add transpose operations to each training iteration.…”
Section: Encrypted Matrix Multiplication
confidence: 99%
“…On one hand, encryption increases the management cost for cloud service providers handling user data. On the other hand, encryption significantly limits the operability of the data, making it difficult to implement common data services such as keyword search [5,6,7] and online editing [8,9].…”
Section: Introduction
confidence: 99%