2019
DOI: 10.1016/j.automatica.2019.04.015

Fast robust methods for singular state-space models

Abstract: State-space models are used in a wide range of time series analysis applications. Kalman filtering and smoothing are work-horse algorithms in these settings. While classic algorithms assume Gaussian errors to simplify estimation, recent advances use a broad range of optimization formulations to allow outlier-robust estimation, as well as constraints to capture prior information. Here we develop methods on state-space models where either transition or error covariances may be singular. These models frequently ar…

Cited by 8 publications (11 citation statements). References 21 publications.
“…In our implementation, we need only compute a single block bidiagonal factorization once, which can then be used to solve (8) in O(n²N) operations in each iteration, no more expensive than a single matrix-vector multiply. For piecewise linear-quadratic ρ [26,15], DRS converges to an optimal solution at a local linear rate [16], which does not depend on the condition number of A. A good initialization makes DRS competitive with the fastest available solvers, even second order methods with quadratic local rates [15].…”
Section: Algorithm 1 Douglas-Rachford Splitting (DRS)
Citation type: mentioning
confidence: 99%
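The quoted passage describes a Douglas-Rachford splitting (DRS) scheme in which a matrix factorization is computed once and then reused to solve the quadratic subproblem at every iteration. The following is a minimal sketch of that pattern, assuming a generic quadratic-plus-nonsmooth objective; the matrices Q and b, the step size t, and the ℓ₁ term are illustrative stand-ins rather than the paper's block-bidiagonal system (8).

```python
# Hedged sketch of DRS where the quadratic prox reuses a cached factorization.
# A dense Cholesky stands in for the paper's block-bidiagonal factorization.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
Q = A.T @ A + np.eye(n)            # SPD Hessian of the smooth piece f(x) = 0.5 x'Qx - b'x
b = rng.standard_normal(n)
t = 1.0                            # DRS step size

# prox of t*f amounts to solving (I + t*Q) x = z + t*b; factor once, reuse every iteration
factor = cho_factor(np.eye(n) + t * Q)
prox_f = lambda z: cho_solve(factor, z + t * b)

# prox of t*g for g(x) = ||x||_1: soft-thresholding (a simple nonsmooth stand-in for rho)
prox_g = lambda z: np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

z = np.zeros(n)
for _ in range(300):               # standard DRS iteration
    x = prox_f(z)
    y = prox_g(2 * x - z)
    z = z + y - x
print("DRS fixed-point residual:", np.linalg.norm(x - y))
```

Once the factorization is cached, each iteration costs only back-substitutions and a soft-threshold, which is the spirit of the O(n²N)-per-iteration claim in the quote.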
“…In this paper, we build on the recently proposed framework of [16] for singular models, and systematically develop complementary modeling elements: robust penalties, informative constraints, and singular models. The resulting approach exploits the structure of singular covariances head-on rather than using workarounds such as pseudo-inverses or variance boosting that either do not work in the general setting or introduce unnecessary changes to the fundamental model (see discussion in [16]). The red dash-dot shows a 'robust' Huberized approach implemented using a pseudo-inverse; green dash shows the proposed singular ℓ₂ estimate; blue solid shows the proposed singular Huber estimate, which clearly tracks the true state.…”
Section: Introduction
Citation type: mentioning
confidence: 99%
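The comparison in this quoted passage rests on the Huber penalty, which is quadratic for small residuals and linear beyond a threshold, so outliers exert bounded influence on the estimate. A minimal sketch, assuming the standard Huber definition with an illustrative threshold kappa:

```python
# Hedged sketch: Huber penalty versus the quadratic (least-squares) loss.
import numpy as np

def huber(r, kappa=1.0):
    """Quadratic for |r| <= kappa, linear beyond, so large residuals
    contribute only linearly to the objective."""
    r = np.asarray(r, dtype=float)
    quad = 0.5 * r**2
    lin = kappa * (np.abs(r) - 0.5 * kappa)
    return np.where(np.abs(r) <= kappa, quad, lin)

residuals = np.array([0.1, 0.5, 1.0, 5.0, 50.0])
print("l2   :", 0.5 * residuals**2)   # the outlier dominates quadratically
print("huber:", huber(residuals))     # the outlier is penalized only linearly
```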
“…We now have, fully and explicitly, first and second derivatives of the value function v(θ) in (7) for the nonsingular case. Though these results are straightforward, they do not appear in any smoothing literature we are aware of in this compact form, even for least squares losses.…”
Section: Nonsingular SSM
Citation type: mentioning
confidence: 99%
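This quoted passage refers to explicit first and second derivatives of a value function v(θ). The mechanism that makes such derivatives tractable is the envelope theorem: for v(θ) = min_x f(x, θ), the first derivative equals the partial derivative of f with respect to θ evaluated at the minimizer. A minimal numerical check of that identity, using an illustrative quadratic f rather than the paper's objective (7):

```python
# Hedged sketch: envelope theorem for a value function v(theta) = min_x f(x, theta).
import numpy as np

def f(x, theta):
    return 0.5 * (x - theta) ** 2 + 0.5 * theta**2 * x**2

def v(theta):
    x_star = theta / (1.0 + theta**2)    # closed-form minimizer in x
    return f(x_star, theta), x_star

theta = 0.7
val, x_star = v(theta)
# partial derivative of f with respect to theta at fixed x, evaluated at x_star
analytic = -(x_star - theta) + theta * x_star**2
eps = 1e-6
numeric = (v(theta + eps)[0] - v(theta - eps)[0]) / (2 * eps)
print(analytic, numeric)                 # the two agree to roughly 1e-9
```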
“…A singular covariance matrix precludes any errors and innovations that are not in its range. We follow [7] in formulating this problem:…”
Section: Singular SSM
Citation type: mentioning
confidence: 99%
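The quoted formulation starts from the fact that a singular covariance confines errors and innovations to its range. A minimal sketch of that fact, assuming an arbitrary low-rank factor B that is purely illustrative and not the paper's model matrices:

```python
# Hedged sketch: draws from N(0, Q) with singular Q = B B^T stay in range(Q).
import numpy as np

rng = np.random.default_rng(1)
n, r = 4, 2
B = rng.standard_normal((n, r))        # rank-2 square root, so Q is singular
Q = B @ B.T
print("rank of Q:", np.linalg.matrix_rank(Q))   # 2, not 4

w = B @ rng.standard_normal(r)         # one draw from N(0, Q)
# Projecting w onto range(Q) leaves it unchanged: the error cannot leave the range.
U, s, _ = np.linalg.svd(Q)
P_range = U[:, :r] @ U[:, :r].T
print("distance to range(Q):", np.linalg.norm(w - P_range @ w))   # ~1e-15
```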