2021
DOI: 10.1109/jsait.2021.3105359

Quantization of Distributed Data for Learning

Cited by 18 publications (8 citation statements)
References 18 publications
“…This process achieved up to 4.86 times improvement in the network lifetime [33]. They also modified an algorithm wherein the sink can predict its next movement and presented simulation results to verify this approach [34].…”
Section: Results (mentioning, confidence: 99%)
“…Step 11: Repeat from step 3 until all the nodes are dead. They also modified an algorithm wherein the sink can predict its next movement and presented simulation results to verify this approach [34]. Data delivery latency influenced the speed of the mobile sink.…”
Section: Results (mentioning, confidence: 99%)
“…Note that studying “how covariate quantization affects recovery/learning” is meaningful, especially when the problem is interpreted as sparse linear regression: working with low-precision data in some (distributed) learning systems could significantly reduce communication cost and power consumption [43], [96], which we further demonstrate in Section IV-A. However, almost all prior works are restricted to response quantization.…”
Section: A. Related Work (mentioning, confidence: 99%)
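
The sparse-regression claim in this excerpt lends itself to a small numeric check. Below is a minimal sketch, assuming a generic uniform scalar quantizer and a plain ISTA Lasso solver; neither is the specific scheme of [96] or [43], and all problem sizes are made up. It illustrates that support recovery can survive coarse (here 3-bit) covariate quantization.

import numpy as np

rng = np.random.default_rng(0)
n, p, s = 200, 50, 5                     # samples, ambient dimension, sparsity
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0                           # ground-truth sparse coefficients
y = X @ beta + 0.05 * rng.standard_normal(n)

def quantize(A, bits, lo=-3.0, hi=3.0):
    # Uniform scalar quantizer with 2**bits levels on [lo, hi]; entries are
    # clipped, then mapped to the midpoint of their quantization cell.
    step = (hi - lo) / 2 ** bits
    A = np.clip(A, lo, hi - 1e-9)
    return lo + (np.floor((A - lo) / step) + 0.5) * step

Xq = quantize(X, bits=3)                 # 3-bit covariates

def ista(X, y, lam=0.05, iters=1000):
    # Plain ISTA for the Lasso: min_b (1/2n)||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n    # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(iters):
        z = b - X.T @ (X @ b - y) / (n * L)      # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b

print("support, full-precision X:", np.flatnonzero(np.abs(ista(X, y)) > 0.1))
print("support, 3-bit X        :", np.flatnonzero(np.abs(ista(Xq, y)) > 0.1))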
“…Indeed, to reduce power consumption and computational cost, it is sometimes preferable to work with low-precision data in a machine learning system; e.g., the sample quantization scheme developed in [96] led to experimental success in training linear models. It was also shown that direct gradient quantization may not be efficient in certain distributed learning systems where the terminal nodes are connected to the server only through a very weak communication fabric and the number of parameters is extremely large; rather, quantizing and transmitting some important samples could provably reduce the communication cost [43]. In fact, the process of data collection may already involve quantization due to limits of the data acquisition device (e.g., a low-resolution analog-to-digital module used in distributed signal processing [25]).…”
Section: A. Covariate Quantization (mentioning, confidence: 99%)
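
The gradient-versus-sample trade-off in this excerpt reduces to simple bit accounting. A minimal sketch follows, with hypothetical sizes (p, n_node, d, rounds, and b are illustrative, not figures from [43]): when the parameter count dwarfs the data held by a node, shipping quantized samples once costs far less uplink than shipping a quantized gradient every round.

# Hypothetical sizes for illustration only (not figures from [43]).
p       = 10_000_000   # model parameters ("extremely large")
n_node  = 1_000        # samples held by one terminal node
d       = 512          # features per sample
rounds  = 100          # optimization rounds with gradient exchange
b       = 4            # bits per quantized scalar

bits_gradients = rounds * p * b        # a quantized gradient every round
bits_samples   = n_node * (d + 1) * b  # quantized (x, y) pairs, sent once

print(f"gradient exchange: {bits_gradients / 8e9:.2f} GB uplink")
print(f"one-shot samples : {bits_samples / 8e9:.4f} GB uplink")
# -> 0.50 GB vs ~0.0003 GB: when p >> n_node * d, sending quantized data wins.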