2023
DOI: 10.1109/tac.2022.3213608

Convergence Analysis of Dual Decomposition Algorithm in Distributed Optimization: Asynchrony and Inexactness

Abstract: Dual decomposition is widely utilized in the distributed optimization of multiagent systems. In practice, the dual decomposition algorithm is desired to admit an asynchronous implementation due to imperfect communication, such as time delay and packet drop. In addition, computational errors also exist when the individual agents solve their own subproblems. In this article, we analyze the convergence of the dual decomposition algorithm in the distributed optimization when both the communication asynchrony and t…
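
To make the setting described in the abstract concrete, below is a minimal Python sketch of a generic dual decomposition loop in which each agent solves a quadratic subproblem only approximately (bounded error) and reads a possibly stale copy of the dual variable (bounded delay). The problem data, the delay model, and the error bound are illustrative assumptions, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy resource-allocation problem with a single coupling constraint:
#   minimize   sum_i 0.5 * q_i * x_i^2 - c_i * x_i
#   subject to sum_i x_i = b
n_agents = 5
q = rng.uniform(1.0, 3.0, n_agents)   # local curvatures (assumed data)
c = rng.uniform(0.0, 2.0, n_agents)   # local linear costs (assumed data)
b = 4.0                               # shared budget (assumed data)

alpha = 0.1        # dual stepsize
max_delay = 3      # assumed bound on communication delay
eps = 1e-2         # assumed bound on local computation error

lam_hist = [0.0]   # dual-variable history, so agents can read stale values
x = np.zeros(n_agents)

for k in range(200):
    for i in range(n_agents):
        # Asynchrony: agent i reads a dual variable that may be outdated.
        delay = int(rng.integers(0, max_delay + 1))
        lam_stale = lam_hist[max(0, len(lam_hist) - 1 - delay)]
        # Inexactness: the local minimizer of
        #   0.5*q_i*x^2 - c_i*x + lam*x   (closed form: (c_i - lam)/q_i)
        # is perturbed by a bounded error.
        x[i] = (c[i] - lam_stale) / q[i] + rng.uniform(-eps, eps)
    # Master step: dual subgradient ascent using the constraint residual.
    lam_hist.append(lam_hist[-1] + alpha * (x.sum() - b))

print("constraint residual:", x.sum() - b)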

Cited by 8 publications (1 citation statement)
References 46 publications
“…The existing literature that designs adaptive stepsizes primarily considers subgradient methods within the primal domain. These results can be readily applied to distributed algorithms that are based on duality [36]- [39]. More specifically, in the case of dual decomposition algorithms for distributed optimization, the master problem that coordinates the subproblems at subsystems is often solved by using subgradient methods [40] where the earlier results for stepsizes can be adopted.…”
Section: Introduction (mentioning)
Confidence: 99%
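
The stepsize rules mentioned in this citing statement can be swapped into the master (dual) update of a sketch like the one above. The two rules below are standard textbook choices, a diminishing schedule and a Polyak-type adaptive step; the constants and the optimality-gap estimate are placeholders, not values from the cited references.

import math

def diminishing_step(k, a0=1.0):
    # Non-summable, square-summable schedule a0 / sqrt(k + 1).
    return a0 / math.sqrt(k + 1)

def polyak_step(gap_estimate, subgrad_norm_sq):
    # Polyak-type adaptive step: estimated optimality gap divided by the
    # squared subgradient norm (here, the coupling-constraint residual).
    return gap_estimate / max(subgrad_norm_sq, 1e-12)

# Example use in the master iteration, replacing the constant alpha:
#   alpha_k = diminishing_step(k)
#   lam_hist.append(lam_hist[-1] + alpha_k * (x.sum() - b))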