2022 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit50566.2022.9834439

Private Read Update Write (PRUW) with Storage Constrained Databases

Abstract: We investigate the problem of private read update write (PRUW) with heterogeneous storage constrained databases in federated submodel learning (FSL). In FSL, a machine learning (ML) model is divided into multiple submodels based on the different types of data used to train it. A given user downloads, updates, and uploads the updates back to a single submodel of interest, based on the type of the user's local data. With PRUW, the process of reading (downloading) and writing (uploading) is carried out such that informatio…
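The abstract describes the FSL setup: the global model is partitioned into submodels by data type, and each user reads, updates, and writes back only the submodel matching its local data. The sketch below is a minimal conceptual illustration of that read-update-write flow, not the paper's scheme; the names (global_model, fsl_round, local_training_step) and the toy update rule are assumptions, and the PRUW privacy mechanism (hiding which submodel is accessed from the databases) is not modeled.

```python
import numpy as np

# Hypothetical global model: one parameter vector per data type ("submodel").
global_model = {
    "images": np.zeros(8),
    "text": np.zeros(8),
    "audio": np.zeros(8),
}

def local_training_step(submodel: np.ndarray, local_data: np.ndarray) -> np.ndarray:
    """Toy local update: nudge the submodel toward the mean of the local data."""
    return submodel + 0.1 * (local_data.mean() - submodel)

def fsl_round(user_data_type: str, local_data: np.ndarray) -> None:
    # Read: download only the submodel of interest.
    submodel = global_model[user_data_type].copy()
    # Update: train locally on the user's data.
    updated = local_training_step(submodel, local_data)
    # Write: upload the update back to the same submodel.
    global_model[user_data_type] = updated

# Example: a user whose local data is text updates only the "text" submodel.
fsl_round("text", np.random.default_rng(0).normal(size=100))
print(global_model["text"])
```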

Cited by 12 publications (3 citation statements) | References 52 publications
“…Note from (35) that for segment 1, the two real sparse subpackets 2 and 4 have been correctly updated, while ensuring that the rest of the subpackets remain the same, without revealing the real subpacket indices 2 and 4 to any of the databases. The resulting writing cost is given by,…”
Section: General Schemes With Examples (mentioning)
confidence: 99%
“…Apart from the privacy leakage, another drawback of FL is the large communication cost incurred by sharing model parameters and updates with millions of users in multiple rounds. Some of the solutions to this problem include, gradient quantization [25][26][27][28], federated submodel learning (FSL) [29][30][31][32][33][34][35][36][37], and gradient sparsification [38][39][40][41][42][43][44][45]. In gradient quantization, the values of the gradients are quantized and represented with a fewer number of bits.…”
Section: Introduction (mentioning)
confidence: 99%
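The passage above explains gradient quantization as representing gradient values with fewer bits to reduce communication cost. As a rough illustration only, not drawn from the cited works [25]-[28], the sketch below uniformly quantizes a gradient vector to a small number of bits and reconstructs an approximation; the function names and the 4-bit setting are assumptions.

```python
import numpy as np

def quantize_gradient(grad: np.ndarray, num_bits: int = 4):
    """Uniformly quantize a gradient vector to `num_bits` bits per entry.

    Each entry is mapped to one of 2**num_bits levels spanning
    [grad.min(), grad.max()]; only the integer levels plus two floats
    (offset, scale) need to be communicated instead of full-precision values.
    """
    levels = 2 ** num_bits - 1
    g_min, g_max = grad.min(), grad.max()
    scale = (g_max - g_min) / levels if g_max > g_min else 1.0
    q = np.round((grad - g_min) / scale).astype(np.uint8)  # quantized levels
    return q, g_min, scale

def dequantize_gradient(q: np.ndarray, g_min: float, scale: float) -> np.ndarray:
    """Reconstruct an approximate gradient from its quantized representation."""
    return g_min + scale * q.astype(np.float64)

# Example: quantize a random gradient to 4 bits and measure the error.
rng = np.random.default_rng(1)
grad = rng.normal(size=1000)
q, g_min, scale = quantize_gradient(grad, num_bits=4)
approx = dequantize_gradient(q, g_min, scale)
print("max abs error:", np.abs(grad - approx).max())
```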