2021 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/globecom46510.2021.9685858
Optimization-based Block Coordinate Gradient Coding

Cited by 3 publications (7 citation statements). References 10 publications.
“…The master aims to minimize the risk function with respect to the model parameter θ using commonly used gradient descent methods. To handle a massive amount of training data, the master implements a gradient descent method with the help of all workers. Specifically, the master partitions the whole data set and sends to each worker some particular subsets so that the master and N workers can collaboratively compute the gradient ∇_θ ℓ̂(θ; D) = Σ_{y∈D} ∇_θ ℓ(θ; y) in each iteration.…”
Section: System Setting
confidence: 99%
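For orientation, the following is a minimal sketch of the master/worker gradient aggregation described in this statement. The function names, the least-squares loss, and all hyperparameters are illustrative assumptions, not the cited paper's implementation, which additionally applies block coordinate gradient coding on top of this plain aggregation.

```python
# Sketch of distributed gradient descent: the master partitions the data set D,
# each worker computes the gradient on its local subset, and the master sums
# the partial gradients to obtain grad_theta l_hat(theta; D) = sum_{y in D} grad_theta l(theta; y).
# All names and the quadratic loss are illustrative assumptions.
import numpy as np

def local_gradient(theta, subset):
    # Per-worker gradient; a least-squares loss 0.5 * (X @ theta - y)^2 is assumed here.
    X, y = subset
    return X.T @ (X @ theta - y)

def partition_data(X, y, n_workers):
    # Master splits D into n_workers disjoint subsets.
    idx = np.array_split(np.arange(len(y)), n_workers)
    return [(X[i], y[i]) for i in idx]

def distributed_gradient_descent(X, y, n_workers=4, n_iters=100, lr=0.01):
    theta = np.zeros(X.shape[1])
    subsets = partition_data(X, y, n_workers)
    for _ in range(n_iters):
        # Workers compute local gradients in parallel; the master aggregates them.
        grads = [local_gradient(theta, s) for s in subsets]
        theta -= lr * np.sum(grads, axis=0)
    return theta
```

In the coded setting of the paper, each worker would instead receive (and encode gradients over) overlapping subsets so that the master can recover the full gradient from a subset of worker responses; the sketch above only shows the uncoded baseline.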
“…We would like to minimize E[τ(s, T)] by optimizing the coding parameters s for the L coordinates under the constraints in (1).…”
Section: A Problem Formulation
confidence: 99%
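Stated compactly, the quoted objective takes the form below. The feasible set symbol is an assumed placeholder for the constraints in (1) of the citing paper, which are not reproduced in this excerpt.

```latex
% Minimize the expected per-iteration completion time tau over the coding
% parameters s of the L coordinates; \mathcal{S} is an assumed shorthand
% for the constraint set (1), which this excerpt does not spell out.
\min_{s \in \mathcal{S}} \; \mathbb{E}\left[ \tau(s, T) \right]
```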