2020
DOI: 10.3390/e22020152

Gaussian Process Regression for Data Fulfilling Linear Differential Equations with Localized Sources

Abstract: Specialized Gaussian process regression is presented for data that are known to fulfill a given linear differential equation with vanishing or localized sources. The method allows estimation of system parameters as well as strength and location of point sources. It is applicable to a wide range of data from measurement and simulation. The underlying principle is the well-known invariance of the Gaussian probability distribution under linear operators, in particular differentiation. In contrast to approaches wi…
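The abstract's core idea can be illustrated with a minimal sketch (this is an assumption-laden toy, not the paper's exact construction): build a GP kernel from solutions of a linear differential equation, here the 1D Laplace equation u'' = 0, so that the posterior mean automatically satisfies the constraint. Solutions of u'' = 0 are u(x) = a + b x, giving a Mercer-style kernel k(x, x') = s0² + s1² · x x' that is annihilated by d²/dx² in each argument.

```python
import numpy as np

# Toy kernel built from solutions of the 1D Laplace equation u'' = 0
# (basis functions 1 and x), so k(x, x') = s0^2 + s1^2 * x * x'.
# Any posterior mean under this prior is exactly a straight line,
# i.e., it fulfills the differential equation by construction.
def kernel(x, xp, s0=1.0, s1=1.0):
    return s0**2 + s1**2 * np.outer(x, xp)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-3):
    # Standard GP regression posterior mean with a small noise jitter
    # (the rank-2 kernel makes the Gram matrix singular without it).
    K = kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = kernel(x_test, x_train)
    return Ks @ np.linalg.solve(K, y_train)

x_train = np.array([0.0, 1.0, 2.0])
y_train = np.array([0.1, 1.05, 2.0])   # noisy samples of a linear trend
x_test = np.array([0.5, 1.5, 3.0])
mu = gp_posterior_mean(x_train, y_train, x_test)
# mu lies on a straight line: the prior assigns zero probability to
# any function violating u'' = 0.
```

The same recipe generalizes to the Laplace, heat, and Helmholtz operators treated in the paper by choosing basis functions from the respective null spaces; parameter names (`s0`, `s1`, `noise`) are illustrative, not taken from the source.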

Cited by 16 publications (9 citation statements)
References 16 publications
“…In this sense, the approach is similar to that of Section 6.2, although the kernel is not obtained from a transformation of a prior kernel but rather is constructed from solutions to the problem Lu = 0. Albert and Rath (2020) show that such a covariance kernel must satisfy…”
Section: Specialized Kernel Construction
confidence: 99%
“…They point out that convolution kernels can also be considered. Albert and Rath (2020) study the Laplace, heat, and Helmholtz equations, performing MLE and inferring the solution u from the PDE constraint and scattered observations. They construct kernels of the form E41, which satisfy Eq.…”
Section: Specialized Kernel Construction
confidence: 99%
“…[5] In regression with GPs, the posterior distribution of unknown values is inferred, making it a powerful tool for nonlinear multivariate interpolation. [6] Correlations over time are captured by a covariance function or kernel, which may be designed for specific tasks (e.g., to fulfill laws of physics [7,8]). A generalization of GPs is given by TPs [9], which allow a more robust estimation of mean and variance for data with outliers.…”
Section: Introduction
confidence: 99%
“…While Graepel [10] considered boundary conditions together with differential equation constraints, the approach involved performing regression for a factorized representation of the solution, constructed in special domains and for Dirichlet conditions; a straightforward construction for general domains was not provided, nor was co-kriging considered. Other related work includes that of Owhadi [12], who considered Bayesian numerical homogenization, and of Albert and Rath [13], who utilized covariance kernels in the form of a Mercer expansion to enforce PDE constraints. In addition to our unique framework, we go beyond the examples of Solin and Kok [11] by considering general mixed boundary conditions, such as Dirichlet conditions in certain regions of ∂Ω and Neumann conditions in other regions.…”
Section: Introduction
confidence: 99%