2013
DOI: 10.1109/tsp.2013.2276440
Convergence and Applications of a Gossip-Based Gauss-Newton Algorithm

Abstract: The Gauss-Newton algorithm is a popular and efficient centralized method for solving non-linear least squares (NLLS) problems. In this paper, a multi-agent distributed version of this algorithm, named the Gossip-based Gauss-Newton (GGN) algorithm, is proposed to solve general NLLS problems in a network. Furthermore, we analyze and present sufficient conditions for its convergence and show numerically that the GGN algorithm achieves performance comparable to the centralized algorithm, with graceful degradation in ca…
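The abstract builds on the classical Gauss-Newton iteration for nonlinear least squares. As a point of reference, below is a minimal sketch of one centralized Gauss-Newton loop on a toy exponential-fitting problem; the residual model, Levenberg-style damping, and solver details are illustrative assumptions, not the paper's distributed GGN update.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, num_iters=20, damping=1e-8):
    """Centralized Gauss-Newton for min_x 0.5 * ||residual(x)||^2 (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_iters):
        r = residual(x)   # residual vector r(x)
        J = jacobian(x)   # Jacobian dr/dx
        # Solve the normal equations (J^T J) dx = -J^T r; small damping added for stability.
        A = J.T @ J + damping * np.eye(x.size)
        dx = np.linalg.solve(A, -J.T @ r)
        x = x + dx
    return x

# Toy example (assumed, for illustration): fit y = exp(a * t) to noisy samples.
t = np.linspace(0.0, 1.0, 50)
y = np.exp(1.5 * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)
residual = lambda x: np.exp(x[0] * t) - y
jacobian = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
print(gauss_newton(residual, jacobian, x0=[0.5]))   # recovers a ~ 1.5
```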

Cited by 32 publications (18 citation statements)
References 55 publications (159 reference statements)
“…Some recent developments include the EXTRA algorithm proposed in [7], which guarantees convergence of the consensus-based gradient descent algorithm using a fixed step size. Combined algorithms that mix consensus techniques with other optimization methods have also been developed, e.g., the alternating direction method of multipliers (ADMM) [13][14][15], the primal-dual method [16] and the Gauss-Newton method [17]. Most of this prior art focuses on convex optimization problems.…”
Section: Relation To Prior Work (mentioning)
confidence: 99%
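The excerpt above refers to consensus-based gradient descent with a fixed step size. The sketch below shows the generic decentralized gradient descent (DGD) update that this family builds on; the ring topology, mixing matrix W, local quadratic objectives, and step size are illustrative assumptions, and the code is not EXTRA [7] or the GGN method itself.

```python
import numpy as np

def dgd_step(X, W, grads, alpha):
    """One consensus-based gradient step: each agent averages its neighbors'
    iterates through the mixing matrix W, then moves along its own local
    negative gradient with a fixed step size alpha.
    X: (n_agents, dim) stacked local iterates; grads(X) has the same shape."""
    return W @ X - alpha * grads(X)

# Illustrative setup: 4 agents on a ring, local objectives f_i(x) = 0.5*||x - b_i||^2.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])   # doubly stochastic mixing matrix (assumed)
b = np.array([[1.0], [2.0], [3.0], [4.0]])
grads = lambda X: X - b                    # local gradients of the quadratics
X = np.zeros((4, 1))
for _ in range(200):
    X = dgd_step(X, W, grads, alpha=0.05)
# Agents approach a neighborhood of the network-wide minimizer mean(b) = 2.5;
# fixed-step DGD keeps a small residual error, which methods such as EXTRA remove.
print(X.ravel())
```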
“…Most of this prior art focuses on convex optimization problems. Only a few related works on consensus-based decentralized algorithms exist for nonconvex optimization, e.g., [14,[17][18][19]. In particular, a stochastic decentralized algorithm has been proposed in [18] for tackling a class of non-convex problems.…”
Section: Relation To Prior Work (mentioning)
confidence: 99%
“…In light of this, various authors have proposed to tackle (1) through decentralized algorithms that are built on the average consensus protocol [9,10]. For example, [11][12][13][14][15][16][17][18] studied the decentralized gradient methods; [19,20] considered the Newton-type methods; [21][22][23][24][25] considered the decentralized primal-dual algorithms. [26,27] are built on the successive convex approximation framework.…”
Section: Introduction (mentioning)
confidence: 99%
“…[26,27] are built on the successive convex approximation framework. The convergence properties of these algorithms were investigated extensively, especially for convex objectives [4,[12][13][14][15][16][17][18][19][20][21][22][23][24][25]; for non-convex objectives, a few recent results can be found in [11,[25][26][27][28][29]. However, most prior works are projection-based, so that each iteration of the above algorithms necessitates a projection step onto the constraint set C, or the solution of a sub-problem with similar complexity.…”
Section: Introduction (mentioning)
confidence: 99%
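The projection step mentioned in the excerpt is the standard Euclidean projection onto the constraint set. Since the set C in the excerpt is unspecified, the sketch below uses a unit ℓ2-ball and a simple quadratic objective as illustrative assumptions to show the projected-gradient pattern being referred to.

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    """Euclidean projection onto C = {x : ||x||_2 <= radius} (illustrative choice of C)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_gradient_step(x, grad, alpha, radius=1.0):
    """The per-iteration pattern: a gradient move followed by a projection back onto C."""
    return project_onto_ball(x - alpha * grad(x), radius)

# Minimize f(x) = 0.5*||x - c||^2 over the unit ball, with c placed outside the ball.
c = np.array([2.0, 0.0])
grad = lambda x: x - c
x = np.zeros(2)
for _ in range(100):
    x = projected_gradient_step(x, grad, alpha=0.1)
print(x)   # converges to the boundary point [1, 0]
```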
“…The algorithm employs network gossiping [16], [17] based on the uncoordinated random exchange protocol [18]. The convergence of this method is analytically proven in [19].…”
Section: Introduction (mentioning)
confidence: 99%
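The network gossiping referred to above is, at its core, randomized pairwise averaging between neighboring nodes. The sketch below shows that generic mechanism on an assumed 4-node ring; it is a hedged stand-in, and the exact wake-up and exchange rules of the uncoordinated random exchange protocol in [18] may differ.

```python
import numpy as np

def pairwise_gossip(values, edges, num_rounds=2000, seed=0):
    """Randomized pairwise gossip averaging: at each tick a random edge (i, j)
    wakes up and both endpoints replace their values with the pair average.
    Over many rounds, all nodes converge to the network-wide average."""
    rng = np.random.default_rng(seed)
    x = np.array(values, dtype=float)
    for _ in range(num_rounds):
        i, j = edges[rng.integers(len(edges))]
        x[i] = x[j] = 0.5 * (x[i] + x[j])
    return x

# Illustrative 4-node ring; every node approaches the average of the initial values (2.5).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(pairwise_gossip([1.0, 2.0, 3.0, 4.0], edges))
```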