2004
DOI: 10.1016/s0031-3203(03)00431-x

On incremental and robust subspace learning

Abstract: Principal Component Analysis (PCA) has been of great interest in computer vision and pattern recognition. In particular, incrementally learning a PCA model, which is computationally efficient for large scale problems as well as adaptable to reflect the variable state of a dynamic system, is an attractive research topic with numerous applications such as adaptive background modelling and active object recognition. In addition, the conventional PCA, in the sense of least mean squared error minimisation, is susce…
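The abstract's idea of incrementally learning a PCA model — processing data batch by batch rather than holding everything in memory — can be sketched by accumulating first- and second-order statistics over a stream and eigendecomposing the covariance at the end. This is an illustrative sketch under that simple accumulation scheme, not the paper's own algorithm; the function name and interface are hypothetical.

```python
import numpy as np

def incremental_pca(batches, n_components):
    """Fit PCA from a stream of data batches by accumulating the
    sample count, feature sums, and second-moment matrix, then
    eigendecomposing the resulting covariance. Memory use depends
    only on the feature dimension, not the stream length."""
    n = 0
    sum_x = None
    sum_xxt = None
    for batch in batches:
        batch = np.asarray(batch, dtype=float)
        if sum_x is None:
            d = batch.shape[1]
            sum_x = np.zeros(d)
            sum_xxt = np.zeros((d, d))
        n += batch.shape[0]
        sum_x += batch.sum(axis=0)
        sum_xxt += batch.T @ batch
    mean = sum_x / n
    cov = sum_xxt / n - np.outer(mean, mean)
    # eigh returns eigenvalues in ascending order; take the largest ones
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    return mean, eigvecs[:, order], eigvals[order]
```

Note that this sketch still uses plain least-squares PCA, so it inherits the sensitivity to outliers that the abstract alludes to; robust variants replace the squared-error criterion with one that down-weights outlying samples.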

Cited by 32 publications (32 citation statements)
References 18 publications

“…On the one hand, (20) states that MOSES successfully reduces the dimension of streaming data, namely, (14) holds under certain conditions: Loosely-speaking, (20)…”
Section: Discussion of Theorem
Citation type: mentioning (confidence: 99%)
“…Moreover, MOSES approximately solves Program (7). In other words, MOSES satisfies both (14,15). These statements are made concrete below and proved in Section C of the supplementary material.…”
Section: Proposition
Citation type: mentioning (confidence: 91%)