2020
DOI: 10.1080/21642583.2020.1737846
A survey on distributed filtering, estimation and fusion for nonlinear systems with communication constraints: new advances and prospects

Abstract: In this paper, some recent results on the distributed filtering, estimation and fusion algorithms for nonlinear systems with communication constraints are reviewed. First, some network-induced phenomena and communication protocols in the networked environment are recalled. Second, the recent advances of distributed fusion algorithms for nonlinear systems are discussed, which include distributed fusion extended Kalman filtering, distributed fusion unscented Kalman filtering, distributed fusion cubature Kalman f…
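As a rough, self-contained illustration of the family of methods the abstract enumerates, the sketch below runs a local extended Kalman filter at each of two sensor nodes and combines the local estimates with a simple covariance-intersection rule. The scalar model, the functions f and h, the noise levels and the fixed fusion weights are assumptions of this sketch only; they are not taken from the surveyed paper, and this fusion rule is just one of the many schemes the survey covers.

import numpy as np

# Illustrative nonlinear scalar model (an assumption, not from the survey):
# x_k = 0.9*x_{k-1} + 0.2*sin(x_{k-1}) + w_k,   y_k^(i) = x_k^2/5 + v_k^(i)
def f(x):  return 0.9 * x + 0.2 * np.sin(x)
def F(x):  return 0.9 + 0.2 * np.cos(x)          # df/dx (Jacobian)
def h(x):  return x**2 / 5.0
def H(x):  return 2.0 * x / 5.0                  # dh/dx (Jacobian)

Q, R = 0.01, 0.04                                # process / measurement noise variances

def local_ekf_step(x_est, p_est, y):
    """One predict-update cycle of a local extended Kalman filter (scalar case)."""
    # Predict
    x_pred = f(x_est)
    p_pred = F(x_est) * p_est * F(x_est) + Q
    # Update
    s = H(x_pred) * p_pred * H(x_pred) + R
    k = p_pred * H(x_pred) / s
    x_new = x_pred + k * (y - h(x_pred))
    p_new = (1.0 - k * H(x_pred)) * p_pred
    return x_new, p_new

def ci_fusion(estimates, covariances, weights):
    """Covariance-intersection fusion of local estimates (scalar case):
    P_f^{-1} = sum_i w_i / P_i,  x_f = P_f * sum_i w_i * x_i / P_i."""
    info = sum(w / p for w, p in zip(weights, covariances))
    mean = sum(w * x / p for w, x, p in zip(weights, estimates, covariances))
    p_fused = 1.0 / info
    return p_fused * mean, p_fused

# Simulate two sensor nodes, each running its own EKF, then fuse their estimates.
rng = np.random.default_rng(0)
x_true = 1.0
x_est = [0.0, 0.0]        # local state estimates
p_est = [1.0, 1.0]        # local error variances
for _ in range(50):
    x_true = f(x_true) + rng.normal(0.0, np.sqrt(Q))
    for i in range(2):
        y = h(x_true) + rng.normal(0.0, np.sqrt(R))
        x_est[i], p_est[i] = local_ekf_step(x_est[i], p_est[i], y)
    x_fused, p_fused = ci_fusion(x_est, p_est, weights=[0.5, 0.5])

print(f"true state: {x_true:.3f}, fused estimate: {x_fused:.3f} (variance {p_fused:.4f})")

Here the fixed weights 0.5/0.5 keep the sketch short; covariance intersection is usually applied with weights chosen to minimise the fused covariance, and the survey discusses unscented and cubature local filters as alternatives to the EKF step shown above.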

Cited by 20 publications (8 citation statements). References 173 publications (149 reference statements).
“…Next, to facilitate obtaining Equations (23)-(27), we determine an expression for the one-stage observation predictor of $y^{(j)}_s$ based on the observations of sensor $i$, which will be denoted by $y^{(j/i)}_{s/s-1}$. This expression is easily obtained from (7), taking into account the model assumptions and (11) for $x^{(j)}_{s/s-1}$ and $x$…”
Section: Appendix A, Proof of Theorem (mentioning, confidence: 99%)
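The excerpt above refers to equations (7) and (11) of the citing paper, which are not reproduced on this page. As a purely illustrative sketch of what such a one-stage observation predictor typically looks like, assume a linear observation equation $y^{(j)}_s = H^{(j)}_s x_s + v^{(j)}_s$ with zero-mean noise $v^{(j)}_s$ uncorrelated with sensor $i$'s observations up to time $s-1$; the matrix $H^{(j)}_s$ and this observation model are assumptions of the sketch, not necessarily the model of the cited paper. The orthogonal projection lemma then gives

% Illustrative sketch only: the linear observation model and H^{(j)}_s are
% assumptions of this note, not taken from the cited paper.
\[
  y^{(j/i)}_{s/s-1}
  \;=\; \operatorname{Proj}\!\left[\,y^{(j)}_s \mid y^{(i)}_1,\dots,y^{(i)}_{s-1}\right]
  \;=\; H^{(j)}_s\, x^{(j/i)}_{s/s-1},
\]

where $\operatorname{Proj}$ denotes the orthogonal projection onto the linear space spanned by sensor $i$'s observations and $x^{(j/i)}_{s/s-1}$ is the corresponding one-stage state predictor of $x_s$.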
“…Consequently, this method usually has stronger fault tolerance and lower calculation burdens than the centralised approach. Accordingly, the distributed fusion estimation problem has been of particular interest to researchers, and a large number of distributed fusion estimation algorithms have been developed, using different techniques, in the context of WSNs (see [1][2][3][4][5][6][7] and references therein).…”
Section: Introduction (mentioning, confidence: 99%)
“…Sensor networks consisting of lots of nodes are commonly utilised to monitor physical or environmental conditions, and also serve the requirements of control, estimation and fault detection of networked systems (Chen, Ding, et al., 2019; Hu et al., 2020; Ma et al., 2018, 2019). In recent years, benefiting from the enhancement of sensing and communication capabilities of sensor nodes, they have a wider scope of applications such as environment monitoring, health care applications, intelligent transportation, robotics, and industrial and manufacturing automation (Ding et al., 2020).…”
Section: Introduction (mentioning, confidence: 99%)
“…Distributed fusion estimation, on the other hand, does not generally provide optimal estimators, but reduces the computational load and is usually more suitable for large-scale sensor networks with random transmission failures, because of its parallel structure. Due to these advantages, the use of distributed fusion estimation in multiple-sensor systems has attracted considerable interest in recent years, with various approaches being taken; for detailed information, see [1][2][3][4][5][6][7] and the references therein.…”
Section: Introduction (mentioning, confidence: 99%)
“…${}^{(i)}_{k,S} = 0$, $S \geq k$. Next, we derived expression (21) for $\Phi^{(ij)}_{k,S}$. For $S = k-1, k, k+1$, expression (21) of $\Phi^{(ij)}_{k,S}$ is clear from (3), taking into account that $J^{x(ij)}_{S-1,S} = E\big[O^{x(ij)}_{S-1}\,\mu^{(j)T}_{S}\big]$. In order to calculate $\Phi^{(ij)}_{k,S}$ for $S \geq k+2$, we use (7) for $x$… To determine the first expectation in (A4), we use (2) for $y^{(j)}_S$ and take into account that, by the OPL, $E\big[x^{(ji)T}_{S-a,S-2}\big]$…, and, again, (7) for $x$… To compute the second expectation in (A4), first, from (6), we write the observation predictor as $y$…; using expression (7) of the local smoother, after some manipulation, we obtain $E[x$…”
(mentioning, confidence: 99%)