2021
DOI: 10.1007/s10915-021-01665-y

A Training Set Subsampling Strategy for the Reduced Basis Method

Abstract: We present a subsampling strategy for the offline stage of the Reduced Basis Method. The approach is aimed at bringing down the considerable offline costs associated with using a finely-sampled training set. The proposed algorithm exploits the potential of the pivoted QR decomposition and the discrete empirical interpolation method to identify important parameter samples. It consists of two stages. In the first stage, we construct a low-fidelity approximation to the solution manifold over a fine training set. …
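The paper's algorithm is not reproduced on this page, but the core selection idea named in the abstract is easy to illustrate. Below is a minimal, hypothetical Python sketch, assuming a snapshot matrix whose columns are low-fidelity solutions at the fine training parameters; column-pivoted QR (the same mechanism underlying DEIM-style point selection) ranks those columns, and the leading pivots mark the parameter samples to keep. All names are illustrative, not the authors' code.

```python
# Hedged sketch (not the authors' implementation): parameter subsampling
# via column-pivoted QR, in the spirit of the pivoted-QR/DEIM selection
# the abstract describes.
import numpy as np
from scipy.linalg import qr

def subsample_training_set(snapshots, n_samples):
    """Pick n_samples column indices of `snapshots` (one column per
    parameter in the fine training set). Column pivoting orders the
    columns by how much new information each adds, so the leading
    pivots identify the most informative parameter samples."""
    _, _, piv = qr(snapshots, mode='economic', pivoting=True)
    return piv[:n_samples]

# Toy usage: a cheap "low-fidelity manifold" over 200 parameter values.
params = np.linspace(0.1, 10.0, 200)
S = np.array([np.exp(-p * np.linspace(0.0, 1.0, 50))
              for p in params]).T          # 50 x 200 snapshot matrix
coarse_ids = subsample_training_set(S, 10) # indices into `params`
print(sorted(params[coarse_ids]))          # the subsampled training set
```

In a real offline stage the greedy algorithm would then run only over this coarse subset, which is where the claimed cost reduction comes from.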

Cited by 11 publications (11 citation statements), 2021–2024. References 54 publications.
“…solution to the dual system by considering the right-hand side as the i-th row vector of C(µ), namely, C^T(:, i) in Eq. (9). Correspondingly, the residual r^pr(µ) is obtained by solving Eq. …”
Section: Remark (mentioning, confidence: 99%)
“…This is due to the fact that at each iteration of the greedy algorithm, an error estimator needs to be repeatedly computed for all the samples in the training set. Many adaptive training techniques have been proposed recently for RBM [9,18,19,23]. In contrast, no efficient training techniques are proposed for the interpolatory MOR methods, though similar greedy algorithms using fixed training sets are proposed in [11,13].…”
Section: Introduction (mentioning, confidence: 99%)
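The cost pattern that citing paper points to is easy to see in pseudocode. Below is a minimal, hypothetical sketch of a standard weak-greedy RBM loop in Python; `solve_fom` and `eta` are generic stand-ins (not the paper's API) for the full-order solver and a cheap error estimator.

```python
import numpy as np

def greedy_rb(training_set, solve_fom, eta, tol=1e-6, max_basis=50):
    """Generic weak-greedy loop for the Reduced Basis Method.
    `solve_fom(mu)` returns a full-order snapshot vector; `eta(mu, basis)`
    estimates the reduced-model error at parameter mu. Illustrative only."""
    basis = []
    while len(basis) < max_basis:
        # The bottleneck highlighted above: the estimator is re-evaluated
        # for EVERY sample in the (possibly very fine) training set at
        # every iteration of the greedy loop.
        errors = [eta(mu, basis) for mu in training_set]
        worst = int(np.argmax(errors))
        if errors[worst] < tol:
            break
        snapshot = solve_fom(training_set[worst])
        # Gram-Schmidt step to keep the reduced basis orthonormal.
        for b in basis:
            snapshot = snapshot - (b @ snapshot) * b
        basis.append(snapshot / np.linalg.norm(snapshot))
    return basis
```

Replacing `training_set` with a well-chosen coarse subset, as the subsampling strategy proposes, shrinks the per-iteration estimator sweep without changing the structure of this loop.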