2020
DOI: 10.1609/aaai.v34i04.5714
Pursuit of Low-Rank Models of Time-Varying Matrices Robust to Sparse and Measurement Noise

Abstract: For tracking low-rank models of time-varying matrices, we present a method robust to both uniformly distributed measurement noise and arbitrarily distributed "sparse" noise. In theory, we bound the tracking error. In practice, our use of randomised coordinate descent scales well and yields encouraging results on the changedetection.net benchmark.

Cited by 8 publications (10 citation statements)
References 27 publications
“…It has already been argued that unstructured methods generally require fewer functional assumptions than structured ones. For example, unstructured methods have been proposed for various non-strongly-convex problems, as well as nonconvex cost functions, where notions of dynamic regret can be used as performance indicators (see [17], [18], [65], and [105]–[107]). An attractive feature of time-varying nonconvex optimization algorithms is that they can be free of locally optimal trajectories.…”
Section: A Wider Class of Problems
confidence: 99%
“…In conclusion, MACO makes it possible to find stationary points of an NP-hard problem in matrix completion under uncertainty rather efficiently. The simple and seemingly obvious addition of inequality constraints to matrix completion appears to improve its statistical performance in a number of applications, such as collaborative filtering under interval uncertainty, robust statistics, event detection [7,9], and background modelling in computer vision [5,6,2,1]. We hope this may spark further research, both on handling uncertainty in matrix completion and on efficient algorithms for the same.…”
Section: Discussion
confidence: 99%
“…Then the data structure thus produced is passed to the matrix-factorization component, which factorizes the data and creates two matrices L and R. One of the factors (matrix R) is then passed to the subspace-proximity tester, which uses it to assess whether incoming sensor readings exhibit abnormal behavior and reports the results to the end user. Finally, the subspace-proximity tester relays the data back to the matrix-factorization component to update the input matrix, replacing the oldest data present and updating online [30]–[32], if needed.…”
Section: B. The Modulor Framework
confidence: 99%
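The citing paper's pipeline hands the factor R to a subspace-proximity tester that flags readings far from the learned subspace. A minimal sketch of such a test, assuming the rows of R span that subspace; the function names and the threshold are hypothetical, not taken from either paper:

```python
import numpy as np

def subspace_distance(x, R):
    """Euclidean distance from reading x to the row space of factor R.

    Solves the least-squares problem min_c ||x - c @ R|| and returns the
    residual norm: zero means x lies exactly in the learned subspace.
    """
    c, *_ = np.linalg.lstsq(R.T, x, rcond=None)
    return np.linalg.norm(x - c @ R)

def is_anomalous(x, R, tol=0.5):
    """Flag a reading whose distance to the subspace exceeds tol.

    tol is an illustrative threshold; in practice it would be calibrated
    from the residuals of known-normal readings.
    """
    return subspace_distance(x, R) > tol
```

Readings that project cleanly onto the row space of R pass the test; a large residual signals behavior the current low-rank model cannot explain, which is then fed back to update the factorization online.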