2017 IEEE 56th Annual Conference on Decision and Control (CDC)
DOI: 10.1109/cdc.2017.8264109

Gradient flows in uncertainty propagation and filtering of linear Gaussian systems

Abstract: The purpose of this work is mostly expository and aims to elucidate the Jordan-Kinderlehrer-Otto (JKO) scheme for uncertainty propagation, and a variant, the Laugesen-Mehta-Meyn-Raginsky (LMMR) scheme for filtering. We point out that these variational schemes can be understood as proximal operators in the space of density functions, realizing gradient flows. These schemes hold the promise of leading to efficient ways for solving the Fokker-Planck equation as well as the equations of non-linear filtering. Our a…
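As a point of reference (the potential V, inverse temperature β, and step size h below are our notation, not necessarily the paper's), the JKO scheme mentioned in the abstract builds the transient PDF by a proximal recursion of the form

\[
\rho_k \;=\; \arg\min_{\rho \in \mathcal{P}_2(\mathbb{R}^n)}
\Big\{ \tfrac{1}{2}\, W_2^2(\rho, \rho_{k-1}) \;+\; h\, F(\rho) \Big\},
\qquad
F(\rho) \;=\; \int V \rho \, \mathrm{d}x \;+\; \beta^{-1} \int \rho \log \rho \, \mathrm{d}x,
\]

where W_2 is the 2-Wasserstein distance. As h → 0, the interpolation of {ρ_k} converges to the PDF of the Itô SDE dX_t = −∇V(X_t) dt + √(2β⁻¹) dW_t, i.e., to the solution of the associated Fokker-Planck equation. Roughly speaking, the LMMR variant for filtering replaces F with a functional that also accounts for the measurement likelihood.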

Cited by 16 publications (35 citation statements)
References 19 publications

“…The purpose of this paper is to pursue a systems-theoretic variational viewpoint for computing ρ(x, t) that breaks away from the "solve PDE as a PDE" philosophy, and instead solves (2) as a gradient descent on the manifold of joint PDFs. This emerging geometric viewpoint for uncertainty propagation and filtering has been reported in our recent work [18], [19], but it remained unclear whether this viewpoint can offer computational benefit over the standard PDE solvers. It is not at all obvious whether and how an infinite-dimensional gradient descent can numerically obviate function approximation or spatial discretization.…”
Section: Introduction
confidence: 92%
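For readers unfamiliar with the phrase "gradient descent on the manifold of joint PDFs": in the standard Wasserstein picture (the equation label (2) belongs to the citing paper; V and β below are illustrative notation), the Fokker-Planck equation is rewritten as

\[
\partial_t \rho \;=\; \nabla \cdot \Big( \rho \, \nabla \frac{\delta F}{\delta \rho} \Big),
\qquad
F(\rho) \;=\; \int V \rho \, \mathrm{d}x \;+\; \beta^{-1} \int \rho \log \rho \, \mathrm{d}x.
\]

Since δF/δρ = V + β⁻¹(1 + log ρ), the right-hand side expands to ∇·(ρ∇V) + β⁻¹Δρ, and it equals minus the Wasserstein gradient of F; time-stepping the PDE therefore amounts to steepest descent of the free energy F over the space of PDFs.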
“…where π(x, y) denotes a joint probability measure supported on M × M, and the geodesic distance d_G is given by (13). The infimum in (47) is taken over the set Π_2(µ, ν), which we define as the set of all joint probability measures having finite second moment that are supported on M × M, with prescribed x-marginal µ, and prescribed y-marginal ν. If µ and ν have respective PDFs ρ_x and ρ_y, then we can use the notation W_G(ρ_x, ρ_y) in lieu of W_G(µ, ν).…”
Section: A Wasserstein Gradient Flow
confidence: 99%
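In symbols, the distance described in this excerpt is the standard optimal-transport cost with geodesic ground cost (the labels (13) and (47) refer to equations in the citing paper and are not reproduced here):

\[
W_G^2(\mu, \nu) \;=\; \inf_{\pi \,\in\, \Pi_2(\mu, \nu)}
\int_{\mathcal{M} \times \mathcal{M}} d_G(x, y)^2 \, \mathrm{d}\pi(x, y).
\]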
“…Just like the proximal viewpoint of finite dimensional gradient descent, (54) can be taken as an alternative definition of gradient descent of F(·) on the manifold P_2(M) w.r.t. the metric W_G; see e.g., [47]-[50]. The utilitarian value of (54) over (52) is as follows.…”
Section: B Proximal Recursion on P_2(M)
confidence: 99%
“…This allows interpreting the discrete time-stepping as steepest descent of the functional Φ w.r.t. distance d. Proximal operators have also been used in general Hilbert spaces [13], and in the space of probability density functions [4], [5], [11], [12], [14]-[16]. The idea of applying proximal recursion in the space of probability measures appeared first in [14]; see also [15].…”
Section: Main Idea
confidence: 99%
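The proximal operator alluded to in this excerpt has the usual metric-space form (Φ and d follow the excerpt's notation; the step size h and the Euclidean special case are added for illustration):

\[
\mathrm{prox}^{d}_{h\Phi}(x) \;:=\; \arg\min_{y}
\Big\{ \tfrac{1}{2}\, d(x, y)^2 \;+\; h\, \Phi(y) \Big\},
\qquad
x_k \;=\; \mathrm{prox}^{d}_{h\Phi}(x_{k-1}).
\]

When d is the Euclidean distance, the optimality condition gives (x_k − x_{k−1})/h = −∇Φ(x_k), i.e., one implicit-Euler step of the steepest-descent flow ẋ = −∇Φ(x); replacing d with a Wasserstein distance yields the density-space recursions discussed above.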