2021
DOI: 10.1002/nla.2364

Randomized algorithms for generalized singular value decomposition with application to sensitivity analysis

Abstract: The generalized singular value decomposition (GSVD) is a valuable tool that has many applications in computational science. However, computing the GSVD for large-scale problems is challenging. Motivated by applications in hyper-differential sensitivity analysis (HDSA), we propose new randomized algorithms for computing the GSVD which use randomized subspace iteration and weighted QR factorization. Detailed error analysis is given which provides insight into the accuracy of the algorithms and the choice of the …
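The abstract's core ingredient, randomized subspace iteration, can be illustrated with a minimal sketch. The following is a generic range-finder/projection scheme in the style of standard randomized low-rank SVD methods, not the paper's weighted-QR GSVD algorithms; the parameters `k` (target rank), `p` (oversampling), and `q` (power iterations) are illustrative choices.

```python
import numpy as np

def randomized_range_finder(A, k, p=5, q=2, rng=None):
    """Approximate the dominant range of A with k + p Gaussian test
    vectors, refined by q power iterations with QR re-orthogonalization."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Y = A @ rng.standard_normal((n, k + p))
    Q, _ = np.linalg.qr(Y)
    for _ in range(q):                  # power iterations sharpen the subspace
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    return Q

def randomized_svd(A, k, p=5, q=2, rng=None):
    """Low-rank SVD via projection onto the sampled range."""
    Q = randomized_range_finder(A, k, p, q, rng)
    B = Q.T @ A                         # small (k + p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k, :]

# Example: rank-5 approximation of an exactly rank-5 matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
U, s, Vt = randomized_svd(A, k=5, rng=0)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

For a matrix with exact rank k, the relative error `err` is at machine-precision level; the paper's contribution is extending this template to the GSVD via weighted QR factorizations, with error analysis guiding the choice of these parameters.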

Cited by 22 publications (17 citation statements); references 30 publications.
“…This approach is efficient for large problems which admit low rank structure, as is commonly observed in engineering applications. We refer the reader to [11,22] for more details.…”
Section: Discretization (mentioning; confidence: 99%)
“…Hyper-Differential Sensitivity Analysis (HDSA) utilizes advanced numerical linear algebra for efficient computation of these sensitivities. The algorithmic generality of HDSA has been demonstrated on a range of application problems [11,22,23]. In the context of trajectory planning for hypersonic vehicles, a higher order numerical model is sampled for aerodynamic coefficients in the open-loop formulation to determine solutions that avoid saturating feedback controllers.…”
Section: Introduction (mentioning; confidence: 99%)
“…we conduct all the numerical experiments with A = UΣV^T. This test case is directly inspired by [22,23]. We consider two different target ranks k (k = 5 and k = 15, respectively) to allow a variety of results and comments.…”
Section: 2 (mentioning; confidence: 99%)
“…Each matrix-vector product with the misfit Hessian requires two PDE solves (assuming that adjoints are used [30]). Randomized methods afford greater parallelism than iterative methods when adequate computational resources are available [16,27]. Algorithm 5.2 adapts Algorithm 6 in [28] and provides a scalable (in inversion parameter dimension) approach to solve the Hessian GEVP by iterating on the number of desired eigenvalues (defined by the while loop at Line 5 of Algorithm 5.2), whose termination criterion comes from the interpretation of the eigenvalues as a ratio of contributions from the likelihood and prior (4.2).…”
Section: Proof (→) (mentioning; confidence: 99%)
“…Hyper-differential sensitivity analysis. Through a combination of tools from post-optimality sensitivity analysis, PDE-constrained optimization, and numerical linear algebra, HDSA has provided unique and valuable insights for optimal control and deterministic inverse problems [16,27,30]. This subsection provides essential background to prepare for our extension of HDSA to Bayesian inverse problems which follows.…”
(mentioning; confidence: 99%)