Institute of Mathematical Statistics Collections 2008
DOI: 10.1214/193940307000000518

Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems

Abstract: It has been widely realized that Monte Carlo methods (approximation via a sample ensemble) may fail in large scale systems. This work offers some theoretical insight into this phenomenon in the context of the particle filter. We demonstrate that the maximum of the weights associated with the sample ensemble converges to one as both the sample size and the system dimension tend to infinity. Specifically, under fairly weak assumptions, if the ensemble size grows sub-exponentially in the cube root of the system …
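The weight-collapse phenomenon described in the abstract is straightforward to reproduce numerically. The sketch below is an illustration written for this report, not code from the paper: it draws a fixed-size Gaussian prior ensemble in increasing dimension d, weights it by a Gaussian observation likelihood, and prints the largest normalized weight, which approaches one as d grows while the ensemble size stays fixed.

```python
# Toy experiment illustrating particle-filter weight collapse in high dimension.
import numpy as np

rng = np.random.default_rng(0)

def max_weight(d, n_particles=1000):
    """Largest normalized importance weight for one assimilation step."""
    x_true = rng.standard_normal(d)                     # hidden state
    y = x_true + rng.standard_normal(d)                 # observation with unit noise
    particles = rng.standard_normal((n_particles, d))   # prior ensemble
    log_w = -0.5 * np.sum((y - particles) ** 2, axis=1) # Gaussian log-likelihoods
    log_w -= log_w.max()                                # stabilize before exponentiating
    w = np.exp(log_w)
    w /= w.sum()
    return w.max()

for d in (1, 10, 100, 1000):
    print(f"d = {d:5d}   max weight ~ {max_weight(d):.3f}")
# With the ensemble size held fixed, the maximum weight tends toward 1 as d grows,
# i.e. a single particle carries essentially all of the posterior mass.
```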

Cited by 309 publications (363 citation statements) · References 28 publications

“…In a series of related papers, Bengtsson et al (2008), Bickel et al (2008), and Snyder et al (2008) show that the curse of dimensionality is also manifest in the simplest particle filter. They demonstrate that the required ensemble size scales exponentially with a statistic related, in part, to the system dimension and that may be considered as an effective dimension.…”
Section: Introduction (mentioning)
confidence: 99%
“…In the case examined by Bengtsson et al (2008), Bickel et al (2008), and Snyder et al (2008), the proposal is the transition distribution for the system dynamics, where new particles are generated by evolving particles from the previous time under the system dynamics. It yields the bootstrap filter of Gordon et al (1993) and was termed the ''standard'' proposal by Snyder (2012) and Snyder et al (2015).…”
Section: Introduction (mentioning)
confidence: 99%
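For concreteness, one step of the bootstrap filter with the "standard" proposal described in the statement above can be sketched as follows. This is a minimal illustration, not code from the cited works; `propagate` and `likelihood` are hypothetical placeholders for a specific model.

```python
# One cycle of the bootstrap filter: propagate under the dynamics, weight by
# the observation likelihood, then resample to an equally weighted ensemble.
import numpy as np

def bootstrap_step(particles, y, propagate, likelihood, rng):
    # 1. Proposal = transition density: evolve each particle under the system dynamics.
    forecast = propagate(particles, rng)
    # 2. Importance weights are the observation likelihoods of the forecast particles.
    w = likelihood(y, forecast)
    w = w / w.sum()
    # 3. Multinomial resampling returns an equally weighted ensemble.
    idx = rng.choice(len(forecast), size=len(forecast), p=w)
    return forecast[idx]
```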
“…A number of judicious choices of importance densities, along with various resampling strategies have been proposed in an attempt to tackle this issue [2], [6]. Although many of these have provided satisfactory results in a number of low-dimensional state-space applications [4], [6], they nevertheless remain inefficient in the cases of very large dimensional system [7]- [11]. This is mainly due to the fact that the number of particles required to sufficiently sample the state-space needs to be very large; in some special situations it has been shown that the required number of particles scales exponentially with the system dimension (see e.g.…”
Section: Introduction (mentioning)
confidence: 99%
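As a concrete instance of the resampling strategies mentioned in the statement above, the following is a minimal sketch of systematic resampling, one widely used scheme; it assumes the weights are already normalized and is not taken from the cited references.

```python
# Systematic resampling: one shared uniform offset across n evenly spaced strata.
import numpy as np

def systematic_resample(weights, rng):
    """Indices of resampled particles; `weights` is assumed to sum to one."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # strata positions in [0, 1)
    cumulative = np.cumsum(weights)
    # Guard against floating-point round-off at the upper end of the CDF.
    return np.minimum(np.searchsorted(cumulative, positions), n - 1)
```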
“…Most data mining algorithms suffer from the "curse of dimensionality" [9], which means that their performance on any task deteriorates, sometimes very quickly, as the dimensionality of the problem increases [127]. Thus, scaling up a data mining algorithm not only involves improving its speed but also maintaining the quality of its solution.…”
Section: Introduction (mentioning)
confidence: 99%