2021
DOI: 10.1109/tsp.2020.3048229

Compressed Gradient Methods With Hessian-Aided Error Compensation

Cited by 14 publications (7 citation statements)
References 22 publications
“…The compressors mentioned above can be unified into three general classes. Specifically, [30] proposed a wider class of compressors with bounded relative compression error which covers the compressors used in [13]- [18]; [31] considered a general class of compressors with globally bounded absolute compression error which accommodates the compressors used in [21]- [23]; and [32] studied a general class of compressors with locally bounded absolute compression error which contains the compressors used in [24]- [29]. These studies also analyzed the convergence properties of the proposed algorithms.…”
Section: A. Related Work and Motivation
confidence: 99%
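As a rough illustration of the three classes this excerpt distinguishes, the following Python sketch gives one hypothetical representative of each. The function names, the choice of norms, and the bounds stated in the comments are assumptions made for illustration; they are not definitions taken from [30]-[32].

import numpy as np

# Hypothetical representatives of the three compressor classes; the bounds
# below are stated for this sketch, not quoted from the cited papers.

def top_k(x: np.ndarray, k: int) -> np.ndarray:
    """Bounded *relative* error: keeping the k largest-magnitude entries of a
    d-vector satisfies ||C(x) - x||^2 <= (1 - k/d) * ||x||^2 for every x."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def rounding_quantizer(x: np.ndarray, step: float) -> np.ndarray:
    """Globally bounded *absolute* error: rounding each entry to the nearest
    multiple of `step` satisfies ||C(x) - x||_inf <= step / 2 for every x."""
    return step * np.round(x / step)

def clipped_quantizer(x: np.ndarray, step: float, radius: float) -> np.ndarray:
    """Locally bounded *absolute* error: the step/2 bound above holds only
    when ||x||_inf <= radius; outside that ball the clipping error grows."""
    return step * np.round(np.clip(x, -radius, radius) / step)

The distinction the excerpt draws is visible in the bounds: the first error scales with ||x||, the second is a fixed constant for all inputs, and the third is a constant only on a bounded region of inputs.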
“…The same class of compressors satisfying Assumption 2 has also been used in [31], which incorporates the deterministic quantization used in [19]- [21] and the unbiased random quantization used in [21], [33], [38].…”
Section: A. Compressors
confidence: 99%
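The unbiased random quantization this excerpt refers to is commonly realized by stochastic rounding. Below is a minimal sketch, assuming a fixed quantization step; the function name and interface are illustrative, not taken from [21], [33], [38].

import numpy as np

rng = np.random.default_rng(0)

def stochastic_rounding(x: np.ndarray, step: float) -> np.ndarray:
    """Round each entry down or up to a multiple of `step`, choosing 'up' with
    probability (x - low)/step so that E[Q(x)] = x (unbiasedness), while the
    absolute error per coordinate never exceeds `step`."""
    low = step * np.floor(x / step)
    prob_up = (x - low) / step              # lies in [0, 1]
    go_up = rng.random(x.shape) < prob_up
    return low + step * go_up

Unbiasedness follows from E[Q(x)] = low + step * P(round up) = low + (x - low) = x, and the bounded per-coordinate error places this quantizer in the absolute-error class discussed above.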
“…Inexact gradient methods are considered in [17]- [25]. These methods are directly related to dual decomposition methods.…”
Section: A. Related Work
confidence: 99%
“…Errors in dual decomposition settings may not fit there, because the assumption can directly impose a requirement on the norm of the true subgradients. References [24], [25] discuss means of modeling inexactness of subgradients, again from stochastic/deterministic points of view. They appear to be readily applicable in a distributed optimization setting with dual decomposition.…”
Section: A. Related Work
confidence: 99%
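For concreteness, here is a minimal sketch of the deterministic and stochastic inexactness models the excerpt contrasts; the oracle names and parameters are hypothetical. A deterministic model only constrains the error norm, ||g~ - g|| <= delta, while a stochastic model assumes a zero-mean error with bounded second moment.

import numpy as np

rng = np.random.default_rng(1)

def deterministic_oracle(g: np.ndarray, delta: float) -> np.ndarray:
    """Deterministic model: the returned subgradient differs from g by an
    error of norm at most delta; the direction chosen here is arbitrary,
    since the model only bounds the norm."""
    e = rng.standard_normal(g.shape)
    e *= delta / max(np.linalg.norm(e), 1e-12)
    return g + e

def stochastic_oracle(g: np.ndarray, sigma: float) -> np.ndarray:
    """Stochastic model: E[g~] = g and E||g~ - g||^2 = sigma^2 * g.size."""
    return g + sigma * rng.standard_normal(g.shape)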