2018
DOI: 10.1016/j.jcp.2018.05.025

A near-optimal sampling strategy for sparse recovery of polynomial chaos expansions

Abstract: Compressive sampling has become a widely used approach to construct polynomial chaos surrogates when the number of available simulation samples is limited. Originally, these expensive simulation samples would be obtained at random locations in the parameter space. It was later shown that the choice of sample locations could significantly impact the accuracy of resulting surrogates. This motivated new sampling strategies or design-of-experiment approaches, such as coherence-optimal sampling, which aim at improv…

Cited by 24 publications (27 citation statements)
References 49 publications
“…ADM can also be integrated with other optimization methods to solve the compressive sensing problem, e.g., OMP [6], ℓ1−2 minimization [59], etc. Further, it could be advantageous to integrate our method with sampling strategies (e.g., [3,4]), basis selection method (e.g., [23]), or Bayesian approach (e.g., [25]). The combination of these methods can be especially useful for problems where experiments or simulations are costly and where a good surrogate model of the QoI is needed, e.g., in inverse problems based on a Bayesian framework ([40,57]).…”
Section: Discussion (mentioning)
confidence: 99%
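
The quoted passage points to OMP as one solver for the compressive sensing problem that could be combined with the authors' ADM approach. As a rough, self-contained illustration (not the cited works' setup), the sketch below recovers a synthetic sparse coefficient vector from random Gaussian measurements using scikit-learn's OrthogonalMatchingPursuit; the problem sizes, measurement matrix, and sparsity level are illustrative assumptions.

```python
# Hedged sketch: sparse recovery via Orthogonal Matching Pursuit (OMP).
# The Gaussian measurement matrix and sparsity level are assumptions made
# for illustration only; they do not come from the cited references.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

n_samples, n_features, sparsity = 60, 200, 5
A = rng.standard_normal((n_samples, n_features))     # measurement matrix
c_true = np.zeros(n_features)                        # sparse coefficient vector
c_true[rng.choice(n_features, sparsity, replace=False)] = rng.standard_normal(sparsity)
y = A @ c_true                                       # noiseless observations

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity, fit_intercept=False)
omp.fit(A, y)

rel_err = np.linalg.norm(omp.coef_ - c_true) / np.linalg.norm(c_true)
print(f"relative recovery error: {rel_err:.2e}")
```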
“…Compressive sampling first appeared in the field of signal processing, with the objective of recovering a sparse signal from a significantly smaller number of samples than required by the conventional Shannon-Nyquist sampling rate [19]. Compressive sampling has been extensively applied in various fields where a small sampling rate is desirable [20]-[23]. In [17], motivated by the fact that the discrete cosine transform of connected-vehicle data could be (approximately) sparse, compressive sampling was used to perform the discrete cosine transformation with a small number of samples, thereby reducing the number of data transmissions.…”
Section: B Compressive Sampling (mentioning)
confidence: 99%
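
To make the idea in the quoted passage concrete, here is a minimal sketch of compressive sampling for a signal that is sparse in the DCT domain: far fewer random samples than the signal length are taken, and the DCT coefficients are recovered by ℓ1 minimization (basis pursuit). The signal length, sparsity, sample count, and the use of cvxpy/SciPy are illustrative assumptions, not details from the cited reference [17].

```python
# Hedged sketch: compressive sampling of a DCT-sparse signal with l1 recovery.
# Sizes and solver choice are assumptions for illustration.
import numpy as np
import cvxpy as cp
from scipy.fft import idct

rng = np.random.default_rng(0)

n, k, m = 256, 8, 64                          # signal length, sparsity, number of samples
c = np.zeros(n)                               # k-sparse DCT coefficients
support = rng.choice(n, k, replace=False)
c[support] = rng.standard_normal(k)
x = idct(c, norm="ortho")                     # time-domain signal

rows = rng.choice(n, m, replace=False)        # random sample locations (m << n)
Psi = idct(np.eye(n), axis=0, norm="ortho")   # inverse-DCT basis matrix
A = Psi[rows, :]                              # compressive measurement matrix
y = x[rows]                                   # observed samples

c_hat = cp.Variable(n)                        # basis pursuit: min ||c||_1 s.t. A c = y
prob = cp.Problem(cp.Minimize(cp.norm1(c_hat)), [A @ c_hat == y])
prob.solve()

rel_err = np.linalg.norm(c_hat.value - c) / np.linalg.norm(c)
print(f"relative recovery error: {rel_err:.2e}")
```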
“…The coherence-optimal sampling strategy presented in Section 2.3 is known to produce PC approximations at least as accurate as those obtained with standard Monte Carlo sampling from f(ξ), for both LSA and ℓ1-minimization [21,44,53,80,79]. Further, constructing D-optimal designs from a large pool of candidate samples, regardless of how the candidate samples are generated, can improve least squares PC approximation accuracy compared to designs constructed randomly [60,53].…”
Section: Subspace Pursuit With D-Coherence-Optimal Sampling (mentioning)
confidence: 99%
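
As a hedged illustration of the second point in the quote, the sketch below greedily selects a D-optimal subset of rows (maximizing the log-determinant of the information matrix) from a large pool of candidate one-dimensional Legendre basis evaluations, then fits PC coefficients by least squares. The 1-D Legendre basis, pool size, greedy selection rule, and toy QoI are my own assumptions; this is not the paper's coherence-optimal or D-coherence-optimal algorithm.

```python
# Hedged sketch: greedy D-optimal design selection from a candidate pool,
# followed by a least-squares polynomial chaos fit. All problem settings
# below are illustrative assumptions.
import numpy as np
from numpy.polynomial.legendre import legvander

rng = np.random.default_rng(1)

p, n_cand, n_design = 8, 2000, 20           # polynomial order, pool size, design size
xi_pool = rng.uniform(-1.0, 1.0, n_cand)    # candidate inputs ~ U[-1, 1]
Phi = legvander(xi_pool, p)                 # candidate basis (Vandermonde) matrix

selected = []
M = 1e-8 * np.eye(p + 1)                    # regularized information matrix
for _ in range(n_design):
    best, best_val = None, -np.inf
    for i in range(n_cand):
        if i in selected:
            continue
        row = Phi[i:i + 1, :]
        sign, logdet = np.linalg.slogdet(M + row.T @ row)
        if sign > 0 and logdet > best_val:  # greedy log-det (D-optimality) criterion
            best, best_val = i, logdet
    selected.append(best)
    M += Phi[best:best + 1, :].T @ Phi[best:best + 1, :]

def f(x):                                   # hypothetical smooth QoI
    return np.exp(0.5 * x) * np.sin(3.0 * x)

A, y = Phi[selected, :], f(xi_pool[selected])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares PC coefficients

xi_test = rng.uniform(-1.0, 1.0, 500)
err = np.linalg.norm(legvander(xi_test, p) @ coef - f(xi_test)) / np.linalg.norm(f(xi_test))
print(f"relative validation error: {err:.2e}")
```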