2019
DOI: 10.1016/j.cma.2018.10.028

An adaptive local reduced basis method for solving PDEs with uncertain inputs and evaluating risk

Cited by 19 publications (20 citation statements)
References 34 publications
“…Again, the proposed approach provides approximately two orders of magnitude improvement over the structured adaptive sparse grid methods due to the flexibility of working with an unstructured set of data points. Furthermore, convergence is comparable to the reduced-basis method proposed in the work of Zou et al…”
Section: Numerical Examples
Mentioning, confidence: 59%
“…This problem has previously been solved by Zou et al in a forthcoming work, which proposes an adaptive reduced-basis method for similar problems. The results presented here are compared with those presented in the work of Zou et al and have been developed through personal correspondence with the authors.…”
Section: Numerical Examples
Mentioning, confidence: 99%
“…The adaptive nature of the construction, which, moreover, is informed by the error estimate, makes it potentially more effective than the approach of Reference 12. There have been other techniques in the literature to alleviate the RB offline cost, such as the parameter domain adaptivity [15,16], greedy sampling acceleration through nonlinear optimization [17,18], and local RBM [19-22], and so forth.…”
Section: Introduction
Mentioning, confidence: 99%
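
As context for the greedy sampling and error-estimate-driven enrichment mentioned in the excerpt above, the following is a minimal Python sketch of a generic greedy reduced-basis offline loop. It is not the algorithm of the cited paper or of the referenced works: the affine toy problem A(mu) = A0 + mu*A1, the residual-norm indicator used in place of a rigorous a posteriori error estimate, and all function names are illustrative assumptions.

import numpy as np

def high_fidelity_solve(A0, A1, f, mu):
    # Full-order solve at one parameter value (the expensive offline step).
    return np.linalg.solve(A0 + mu * A1, f)

def rb_solve(A0, A1, f, mu, V):
    # Galerkin projection onto the current reduced basis V (orthonormal columns).
    Ar = V.T @ (A0 + mu * A1) @ V
    fr = V.T @ f
    return V @ np.linalg.solve(Ar, fr)

def residual_indicator(A0, A1, f, mu, V):
    # Cheap stand-in for an a posteriori error estimate:
    # norm of the full-order residual of the reduced solution.
    u_rb = rb_solve(A0, A1, f, mu, V)
    return np.linalg.norm(f - (A0 + mu * A1) @ u_rb)

def greedy_rb(A0, A1, f, train_params, tol=1e-8, max_basis=20):
    # Weak-greedy offline loop: enrich the basis at the parameter where the
    # indicator is largest, until the indicator drops below tol everywhere.
    u0 = high_fidelity_solve(A0, A1, f, train_params[0])
    V = (u0 / np.linalg.norm(u0)).reshape(-1, 1)
    while V.shape[1] < max_basis:
        errs = [residual_indicator(A0, A1, f, mu, V) for mu in train_params]
        worst = int(np.argmax(errs))
        if errs[worst] < tol:
            break
        u_new = high_fidelity_solve(A0, A1, f, train_params[worst])
        u_new = u_new - V @ (V.T @ u_new)  # Gram-Schmidt against current basis
        nrm = np.linalg.norm(u_new)
        if nrm < 1e-12:
            break  # snapshot already captured by the basis
        V = np.hstack([V, (u_new / nrm).reshape(-1, 1)])
    return V

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    A0 = 2.0 * np.eye(n) + 0.01 * rng.standard_normal((n, n))  # toy full-order operator
    A1 = np.diag(np.linspace(1.0, 2.0, n))                     # parameter-dependent part
    f = rng.standard_normal(n)
    V = greedy_rb(A0, A1, f, train_params=np.linspace(0.0, 1.0, 50))
    print("reduced basis size:", V.shape[1])

The point of such a loop is that the expensive full-order solve is performed only at the parameters the indicator flags as worst approximated, which is the general mechanism the excerpt credits with reducing the RB offline cost.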