2023
DOI: 10.1002/oca.2973
A stochastic averaging gradient algorithm with multi‐step communication for distributed optimization

Abstract: This paper studies distributed convex optimization problems over an undirected network where all nodes cooperate to minimize a sum of local objective functions. Each local objective function is further assumed to be an average of several convex instantaneous functions. By incorporating the stochastic averaging gradient into the distributed first-order primal-dual method, a stochastic averaging gradient algorithm with multi-step communication is proposed to solve the optimization problem. For each node, one ran…
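The abstract's ingredients — a SAG-style memory of per-sample gradients, a first-order primal-dual update, and several communication rounds per iteration — can be sketched as follows. This is an illustrative toy, not the paper's exact algorithm: the quadratic local losses, the ring mixing matrix `W`, the step sizes `alpha`/`beta`, and the update order are all assumptions.

```python
import numpy as np

# Hedged sketch of a distributed primal-dual step with a SAG-style gradient
# memory and K communication rounds per iteration (assumptions, not the
# paper's exact method). Problem: minimize over x the sum over nodes i of
# (1/m) * sum_j 0.5 * (a_ij^T x - b_ij)^2, on an undirected ring network.

rng = np.random.default_rng(0)
n, m, d = 4, 5, 3           # nodes, instantaneous functions per node, dimension
K = 2                       # multi-step communication: rounds per iteration
alpha, beta = 0.05, 0.05    # primal / dual step sizes (assumed values)

# Random quadratic data and a doubly stochastic ring-topology mixing matrix.
A = rng.normal(size=(n, m, d))
b = rng.normal(size=(n, m))
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.25
    W[i, (i - 1) % n] = 0.25

x = np.zeros((n, d))               # primal iterate, one row per node
lam = np.zeros((n, d))             # dual iterate for the consensus constraint
grad_table = np.zeros((n, m, d))   # SAG memory: last gradient per sample
grad_avg = np.zeros((n, d))        # running average of the stored gradients

def local_grad(i, j, xi):
    """Gradient of the j-th instantaneous function at node i."""
    return (A[i, j] @ xi - b[i, j]) * A[i, j]

for t in range(500):
    # SAG step: each node refreshes ONE randomly chosen gradient in its table,
    # so the averaged gradient is updated at the cost of a single sample.
    for i in range(n):
        j = rng.integers(m)
        g_new = local_grad(i, j, x[i])
        grad_avg[i] += (g_new - grad_table[i, j]) / m
        grad_table[i, j] = g_new

    # Multi-step communication: K rounds of neighbor averaging with W.
    x_mix = x.copy()
    for _ in range(K):
        x_mix = W @ x_mix

    # Primal-dual update using the averaged stochastic gradient.
    lam = lam + beta * (x - x_mix)          # dual ascent on consensus violation
    x = x_mix - alpha * (grad_avg + lam)    # primal descent

# After enough iterations the nodes should approximately agree.
consensus_gap = np.max(np.abs(x - x.mean(axis=0)))
```

The SAG memory trades storage (`n * m * d` gradients) for variance reduction, while the `K` mixing rounds tighten consensus per iteration at the price of extra communication — the trade-off the title's "multi-step communication" refers to.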

Cited by 1 publication. References 43 publications (124 reference statements).