2012
DOI: 10.1109/tsp.2012.2217338

Diffusion Strategies Outperform Consensus Strategies for Distributed Estimation Over Adaptive Networks

Abstract: Adaptive networks consist of a collection of nodes with adaptation and learning abilities. The nodes interact with each other on a local level and diffuse information across the network to solve estimation and inference tasks in a distributed manner. In this work, we compare the mean-square performance of two main strategies for distributed estimation over networks: consensus strategies and diffusion strategies. The analysis in the paper confirms that under constant step-sizes, diffusion strategies allow inform…

Cited by 410 publications (304 citation statements)
References 53 publications
“…However, the ultimate purpose of state estimation problems is to achieve at each node an estimate that minimizes a predefined cost function, which does not necessarily require that all nodes provide the same result. Moreover, it is shown in [115] that the consensus network can become unstable even if all the local filters are stable, i.e., cooperation by means of consensus algorithms may lead to disastrous consequences. Motivated by such observations, the estimation schemes based on diffusion strategies have been proposed.…”
Section: Distributed Estimation For Networked Systems
confidence: 99%
“…The diffusion filter belongs to the class of single-time-scale estimation schemes, i.e., the communication requirement of the distributed filter is comparable to that of the gossip filter, but diffusion networks can achieve a faster convergence rate and lower MSE than consensus networks. In addition, it is proved in [115] that the stability of the local filters is sufficient to guarantee global stability of the network under the diffusion framework, regardless of the choice of combination weights.…”
Section: Distributed Estimation For Networked Systems
confidence: 99%
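The diffusion framework praised in this excerpt can be sketched as a minimal adapt-then-combine (ATC) LMS simulation. The ring topology, combination weights, step-size, and noise level below are illustrative assumptions, not values taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, T = 5, 3, 2000             # agents, parameter dimension, iterations
w_true = rng.standard_normal(M)  # common parameter all agents estimate

# Combination matrix for a ring topology (illustrative choice of weights).
A = np.zeros((N, N))
for k in range(N):
    A[k, k] = 0.5
    A[k, (k - 1) % N] = 0.25
    A[k, (k + 1) % N] = 0.25

mu = 0.01             # common step-size, assumed small enough for stability
W = np.zeros((N, M))  # current estimates, one row per agent

for _ in range(T):
    # Adapt: each agent runs one LMS step on its own streaming data.
    psi = np.empty_like(W)
    for k in range(N):
        u = rng.standard_normal(M)                 # regressor
        d = u @ w_true + 0.1 * rng.standard_normal()  # noisy measurement
        psi[k] = W[k] + mu * (d - u @ W[k]) * u
    # Combine: each agent mixes its neighbors' intermediate estimates.
    W = A @ psi

mse = np.mean((W - w_true) ** 2)
```

Note that the combination step involves only one exchange with immediate neighbors per time instant, which is why diffusion is described above as a single-time-scale scheme.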
“…It was verified in [13], [28] that a sufficient condition to ensure ρ(B) < 1 is to select the step-sizes {µ k } such that…”
Section: Assumption 1 (Strongly Connected Network)
confidence: 99%
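The excerpt truncates the exact step-size condition, so the sketch below assumes the familiar mean-stability bound for diffusion LMS, µ_k < 2/λ_max(R_k), where R_k is agent k's input covariance; both the bound and the covariance matrices are assumptions for illustration.

```python
import numpy as np

# Hypothetical per-agent input covariance matrices (assumed known here).
R = [np.diag([1.0, 2.0]), np.diag([0.5, 3.0]), np.eye(2)]

def max_stable_step(Rk):
    # Assumed bound: mu_k must stay below 2 / lambda_max(R_k).
    return 2.0 / np.linalg.eigvalsh(Rk)[-1]

bounds = [max_stable_step(Rk) for Rk in R]
mu = [0.9 * b for b in bounds]  # pick step-sizes safely inside the bound
```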
“…Compared with the class of consensus strategies [21]- [27], diffusion networks have been shown to remain stable irrespective of the network topology, while consensus networks can become unstable even when each agent is individually stable [28]. Diffusion strategies have also been shown to lead to improved convergence rate and superior mean-square-error performance [14], [28].…”
confidence: 99%
“…The consensus averaging technique [5] has been studied in linear distributed estimation problems using the distributed Kalman Filter (KF) [6] and also in nonlinear problems using the Particle Filter (PF) [4]. It has the property of asymptotically reaching the solution of the centralized approach, but one of its drawbacks is the potentially prohibitive communication overhead due to the multi-iterative consensus step [7,8]. Due to this shortcoming, other techniques have been suggested in the literature which require less communication bandwidth [9].…”
Section: Introduction
confidence: 99%
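The multi-iterative consensus step whose communication cost this excerpt highlights can be illustrated with a short averaging loop: every pass through the loop is one full round of neighbor-to-neighbor exchanges. The four-node values and mixing matrix are illustrative assumptions.

```python
import numpy as np

# Each node starts with a local value; consensus averaging repeatedly mixes
# neighbor values until every node approaches the global average.
x = np.array([1.0, 4.0, 2.0, 7.0])      # local measurements (illustrative)
A = np.array([[0.50, 0.25, 0.00, 0.25],  # doubly-stochastic mixing matrix
              [0.25, 0.50, 0.25, 0.00],  # for a 4-node ring
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

rounds = 0
while np.ptp(x) > 1e-6:  # each iteration is one communication round
    x = A @ x
    rounds += 1
# All entries converge to the initial mean; `rounds` counts the exchanges
# needed, which is the overhead the excerpt refers to.
```

Because many such rounds are needed between successive measurements, the per-measurement communication load grows with the desired agreement accuracy.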