2019
DOI: 10.1137/18m1189324

Scalable Matrix-Free Adaptive Product-Convolution Approximation for Locally Translation-Invariant Operators

Abstract: We present an adaptive grid matrix-free operator approximation scheme based on a "product-convolution" interpolation of convolution operators. This scheme is appropriate for operators that are locally translation-invariant, even if these operators are high-rank or full-rank. Such operators arise in Schur complement methods for solving partial differential equations (PDEs), as Hessians in PDE-constrained optimization and inverse problems, as integral operators, as covariance operators, and as Dirichlet-to-Neumann maps. …
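For readers new to the construction: roughly speaking, a product-convolution expansion approximates the action of a linear operator $A$ by a short sum of weighted convolutions. The notation below ($w_k$, $\phi_k$, $y_k$, $m$) is illustrative, following the citation statements quoted further down rather than the paper's own definitions:

\[
A u \;\approx\; \widetilde{A} u \;=\; \sum_{k=1}^{m} \phi_k \ast \big( w_k \odot u \big),
\]

where $\phi_k$ is an impulse response of $A$ sampled at a point $y_k$, $w_k$ is an interpolation weight (e.g., part of a partition of unity with $\sum_k w_k = 1$) localized near $y_k$, $\ast$ denotes convolution, and $\odot$ denotes pointwise multiplication. If $A$ were exactly translation-invariant with kernel $\phi$, the sum would collapse to $\phi \ast u = A u$; for locally translation-invariant operators, each $\phi_k$ captures the local behavior near $y_k$, which is why such expansions can remain accurate even when $A$ is high-rank or full-rank.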

Cited by 11 publications (7 citation statements); references 58 publications.

“…For example, the results of section 5.1.5 for the Poisson coefficient inverse problem indicate that $O(10^6)$ PDE solves may still be required even with the most efficient MCMC methods. In such cases, hIPPYlib-MUQ can be used as a prototyping environment to study new methods that further exploit problem structure, for example through the use of various reduced models (e.g., [20]) or via advanced Hessian approximations that go beyond low rank [2,3].…”
Section: Discussion (mentioning)
confidence: 99%
“…In the context of PDEs, product-convolution expansions are encountered in Schur complement methods for solving PDEs, in Dirichlet-to-Neumann maps, and in PDE-constrained optimization as Hessians. In [1], the functions $(u_k, v_k)_{k=1}^{m}$ are chosen as above, but instead of being defined on a Euclidean grid, the locations $(y_k)_{k=1}^{m}$ are adaptively sampled to best approximate the operator with a fixed number $m$.…”
Section: Naive Interpolation Of Impulse Responses (mentioning)
confidence: 99%
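To make the structure in the quotation above concrete, here is a minimal, hypothetical 1-D sketch of a matrix-free product-convolution apply in Python (NumPy/SciPy): a few impulse responses $\phi_k$ sampled near points $y_k$ are combined with hat-function interpolation weights forming a partition of unity. The spatially varying Gaussian blur, the sample points, and the weights are illustrative assumptions, not the paper's adaptive grid construction.

# Minimal, hypothetical 1-D product-convolution sketch (illustrative only).
import numpy as np
from scipy.signal import fftconvolve

def product_convolution_apply(u, weights, kernels):
    # Apply A~ u = sum_k phi_k * (w_k . u): multiply by the local weight,
    # then convolve with the locally sampled impulse response.
    out = np.zeros_like(u, dtype=float)
    for w_k, phi_k in zip(weights, kernels):
        out += fftconvolve(w_k * u, phi_k, mode="same")
    return out

# Toy operator: a Gaussian blur whose width varies slowly across [0, 1].
n = 200
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
y_pts = np.array([0.2, 0.5, 0.8])      # sample points y_k (fixed here, not adaptive)
sigmas = 0.02 + 0.05 * y_pts           # local blur width near each y_k

# Hat-function interpolation weights, normalized into a partition of unity.
weights = [np.clip(1.0 - np.abs(x - yk) / 0.3, 0.0, None) for yk in y_pts]
wsum = np.sum(weights, axis=0)
weights = [w / wsum for w in weights]

# Locally sampled impulse responses: Gaussians of the local width on a stencil.
t = dx * np.arange(-30, 31)
kernels = [np.exp(-0.5 * (t / s) ** 2) for s in sigmas]
kernels = [k / k.sum() for k in kernels]   # normalize each discrete kernel

u = np.sin(8 * np.pi * x)
Au_approx = product_convolution_apply(u, weights, kernels)

Replacing the toy Gaussian kernels with impulse responses obtained by applying an actual operator to point sources at the $y_k$ (and choosing those points adaptively, as the paper does) would turn this sketch into a matrix-free surrogate built from only a handful of operator applications.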
“…Product-convolution expansions appeared at least three decades ago and have been given different names in various fields; see, e.g., [5,25,15,17,10,1]. A remarkable aspect of these expansions is their essential role in the interpolation of linear operators from scattered impulse responses.…”
Section: Introduction (mentioning)
confidence: 99%
“…Also exploiting the pseudodifferential structure of seismic inversion Hessians is the matrix probing method of [31], which approximates the Hessian (and its inverse) with basis matrices stemming from the Hessian's symbol and finds their coefficients by probing the Hessian in random directions. Recently, methods that exploit the local translation invariance of Hessians have been introduced [74,3]. The adaptive product-convolution approximation in particular is demonstrated to be robust to the Péclet number for advection-dominated transport and to the frequency for an auxiliary operator that arises in connection with KKT preconditioning [3] for a wave inverse problem [2].…”
(mentioning)
confidence: 99%
“…Recently, methods that exploit the local translation invariance of Hessians have been introduced [74,3]. The adaptive product-convolution approximation in particular is demonstrated to be robust to the Péclet number for advection-dominated transport and to the frequency for an auxiliary operator that arises in connection with KKT preconditioning [3] for a wave inverse problem [2]. Here we focus on comparisons with low-rank approximation and defer comparison with these other methods in appropriate contexts to future work.…”
(mentioning)
confidence: 99%