2019 IEEE 58th Conference on Decision and Control (CDC)
DOI: 10.1109/cdc40024.2019.9029689
Distributed Constraint-Coupled Optimization over Random Time-Varying Graphs via Primal Decomposition and Block Subgradient Approaches

Abstract: In this paper, we consider a network of processors that want to cooperatively solve a large-scale convex optimization problem. Each processor knows a local cost function that depends only on a local variable. The goal is to minimize the sum of the local costs while making the variables satisfy both local constraints and a global coupling constraint. We propose a simple, fully distributed algorithm that works in a random, time-varying communication model, where at each iteration multiple edges are…
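The problem class in the abstract can be illustrated with a hedged toy example: a primal decomposition scheme for a scalar constraint-coupled problem, in which each agent holds a quadratic cost and a shared budget couples the decisions. The data (c, b), the step size, and the centralized projection step below are illustrative assumptions; this is a minimal sketch of the primal decomposition idea, not the paper's distributed algorithm (which uses only local exchanges over random time-varying graphs).

```python
import numpy as np

# Toy constraint-coupled problem (illustrative data, not from the paper):
#   minimize   sum_i (x_i - c_i)^2
#   subject to sum_i x_i <= b
c = np.array([1.0, 2.0, 3.0])
b = 3.0

def local_solve(ci, yi):
    """Agent i: minimize (x - ci)^2 subject to x <= yi (its allocation)."""
    x = min(ci, yi)
    mu = max(0.0, 2.0 * (ci - yi))  # multiplier of the constraint x <= yi
    return x, mu

# Primal decomposition: allocations y_i with sum(y) = b; a subgradient of
# agent i's optimal-value function with respect to y_i is -mu_i.
y = np.full(c.size, b / c.size)
alpha = 0.1
for _ in range(2000):
    xs, mus = zip(*(local_solve(ci, yi) for ci, yi in zip(c, y)))
    g = -np.array(mus)
    g -= g.mean()        # project the step so that sum(y) stays equal to b
    y -= alpha * g

x = np.array(xs)  # converges to [0, 1, 2]; optimal cost 3
```

The projection `g -= g.mean()` stands in for the distributed allocation update; the subgradient with respect to each allocation is the negated local multiplier, which is the mechanism primal decomposition exploits.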

Cited by 8 publications (12 citation statements). References 17 publications.
“…Lemma 1: For all j ∈ I, the mapping x⋆_j in (16) is (θ_j/ρ_j)-Lipschitz continuous, with θ_j as in (10).…”
Section: Convergence Analysis (mentioning)
confidence: 99%
“…Block-coordinate versions of the dual ascent, where only part of the variables is updated at each iteration, are also explored in the literature [9]. More generally, a variety of distributed algorithms have been proposed to solve constraint-coupled optimization problems, possibly with block-updates and time-varying communication [10], [11], [12]. Nonetheless, in all the cited works, a common clock is employed to synchronize the communication and update frequencies.…”
(mentioning)
confidence: 99%
“…Lemma 1: For all j ∈ I, the mapping x⋆_j in (16) is (θ_j/ρ_j)-Lipschitz continuous, with θ_j as in (10).…”
Section: Algorithm 1, Asynchronous Distributed Dual Ascent (mentioning)
confidence: 99%
“…This work was partially supported by NWO under research project OMEGA (grant no. 613.001.702) and by the ERC under research project COSMOS (802348). […] and time-varying communication [10], [11], [12]. Nonetheless, in all the cited works, a common clock is employed to synchronize the communication and update frequencies.…”
Section: Introduction (mentioning)
confidence: 99%
“…Under the assumption that the agents are able to build increasingly refined estimates of their cost functions, we propose a novel distributed algorithm to solve the problem exactly. The algorithm is inspired by a distributed primal decomposition approach for constraint-coupled optimization [14], [22], suitably modified to account for the online cost-estimation mechanism. The resulting scheme is a three-step procedure in which each agent first obtains an updated cost estimate, then solves a local version of the original problem with the true cost replaced by the estimate, and finally updates a local state after exchanging dual information with neighbors.…”
Section: Introduction (mentioning)
confidence: 99%
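The three-step procedure quoted above can be sketched structurally as one agent's iteration. The 1-D quadratic local problem, the averaging rule for the exchanged state, and all names below are illustrative assumptions, not the paper's actual update laws.

```python
# Structural sketch of the three-step procedure described above.

def agent_iteration(state, neighbor_states, refine_estimate, t):
    # Step 1: obtain an updated estimate of the local cost (here, the
    # center c_hat of a quadratic, refined as t grows).
    c_hat = refine_estimate(t)
    # Step 2: solve the local problem with the estimated cost:
    #   minimize (x - c_hat)^2 + state * x   (state acts as a price).
    x = c_hat - state / 2.0
    # Step 3: update the local state by averaging with neighbors' states
    # (standing in for the "dual information" exchange).
    new_state = (state + sum(neighbor_states)) / (1 + len(neighbor_states))
    return x, new_state
```

As `t` grows the estimate `c_hat` approaches the true cost center, so the local solutions track the exact problem — the mechanism the quoted passage attributes to the online cost-estimation scheme.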