2017
DOI: 10.48550/arxiv.1712.09685
Preprint

Neural network augmented inverse problems for PDEs

Jens Berg,
Kaj Nyström

Abstract: In this paper we show how to augment classical methods for inverse problems with artificial neural networks. The neural network acts as a prior for the coefficient to be estimated from noisy data. Neural networks are global, smooth function approximators and as such they do not require explicit regularization of the error functional to recover smooth solutions and coefficients. We give detailed examples using the Poisson equation in 1, 2, and 3 space dimensions and show that the neural network augmentation is …
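Below is a minimal sketch (not the authors' implementation) of the idea summarized in the abstract: the unknown coefficient a(x) in the 1D Poisson problem -(a(x) u'(x))' = f(x) with homogeneous Dirichlet boundary conditions is parameterized by a small neural network, the forward problem is solved with a classical finite-difference discretization, and the network weights are fitted to noisy observations of u. The use of PyTorch, the grid size, the network architecture, and the optimizer settings are illustrative assumptions, not taken from the paper.

```python
# Sketch: neural network as a prior for the coefficient in a 1D Poisson
# inverse problem, fitted to noisy data through a differentiable
# finite-difference forward solve.
import math
import torch

torch.manual_seed(0)
n = 64                                        # interior grid points (hypothetical choice)
h = 1.0 / (n + 1)
x = torch.linspace(h, 1.0 - h, n).unsqueeze(1)

# "True" coefficient and synthetic noisy observations, for illustration only.
a_true = 1.0 + 0.5 * torch.sin(math.pi * x)
f = torch.ones_like(x)

def solve_poisson(a):
    """Finite-difference solve of -(a u')' = f with u(0) = u(1) = 0."""
    a = a.squeeze(1)
    a_half = 0.5 * (a[:-1] + a[1:])           # coefficient at interior cell faces
    lower = torch.cat([a[:1], a_half])        # face below each node (boundary approximated by a[0])
    upper = torch.cat([a_half, a[-1:]])       # face above each node (boundary approximated by a[-1])
    A = torch.diag(lower + upper) - torch.diag(a_half, 1) - torch.diag(a_half, -1)
    return torch.linalg.solve(A / h**2, f)

u_obs = solve_poisson(a_true) + 0.01 * torch.randn(n, 1)   # noisy measurements of u

# Neural-network prior for the unknown coefficient a(x).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    a_pred = torch.nn.functional.softplus(net(x))          # keep the coefficient positive
    loss = torch.mean((solve_poisson(a_pred) - u_obs) ** 2)
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(step, loss.item())
```

Because the finite-difference solve is differentiable, gradients of the data misfit flow through torch.linalg.solve into the network weights; the smoothness of the network output plays the role of the implicit regularization mentioned in the abstract, so no explicit regularization term is added to the loss.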

Cited by 10 publications (25 citation statements)
References 17 publications
“…The dependence of C_rand on the basis. Lemma 1 shows that the ONB used to randomise our input plays a role, as it appears explicitly in the formula (9). The following proposition underscores that point.…”
Section: 2.2 (mentioning)
Confidence: 66%
“…Being universal approximators, neural networks have been widely used in nonlinear system identification: depending on the architecture and on the properties of the loss function, they can be used as sparse regression models, they can act as priors on unknown coefficients, or they can completely determine an unknown differential operator. Many kinds of architectures have been used for system identification, among which are multi-layer feed-forward networks (see for instance [16], [13], [3], [22], [4]) and recurrent networks and their variants, which have been used in dynamic identification of nonlinear systems because of their ability to retain information in time across layers (see for instance [30], [8], [19], [21]).…”
Section: Introduction (mentioning)
Confidence: 99%
“…Jianyu et al. [12] used an ANN with a radial basis function as the activation function for Poisson equations. More recently, Berg et al. in [4] used a DNN to solve steady (time-independent) problems for a domain with complex geometry in one and two space dimensions, and later in [3] they studied DNN architectures to solve augmented Poisson equations (inverse problems), including three space dimensions.…”
Section: Introduction (mentioning)
Confidence: 99%