2021
DOI: 10.1029/2020wr027400
Deep‐Learning‐Based Adjoint State Method: Methodology and Preliminary Application to Inverse Modeling

Abstract: We present an efficient adjoint model based on the deep‐learning surrogate to address high‐dimensional inverse modeling with an application to subsurface transport. The proposed method provides a completely code nonintrusive and computationally feasible way to approximate the model derivatives, which subsequently can be used to derive gradients for inverse modeling. This conceptual deep‐learning framework, that is, an architecture of deep convolutional neural network through combining autoencoder and autoregre…

Cited by 18 publications (13 citation statements); References 35 publications
“…Traditional optimization methods also benefit from the auto-differentiation mechanism in DL, which makes optimization more efficient by replacing conjugate gradient descent or L-BFGS with DL optimization methods, such as SGD and Adam (Sun, Niu, et al., 2020; Wang, Chang, et al., 2020). DL also inspired new directions in the study of traditional nonlinear optimization algorithms, such as ML-descent (Sun and Alkhalifah, 2020) and DL-based adjoint state methods (Xiao et al., 2021).…”
Section: Combination of DL and Traditional Methods
confidence: 99%
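The statement above contrasts classical optimizers (conjugate gradient, L-BFGS) with DL optimizers such as SGD and Adam driven by automatic differentiation. A minimal sketch of that idea follows, assuming a trained differentiable surrogate and illustrative names (`surrogate`, `d_obs`, `m`) that do not appear in the original works:

```python
# Minimal sketch, not the authors' implementation: gradient-based inversion
# through a differentiable surrogate, with a DL optimizer (Adam) in place of
# conjugate gradient or L-BFGS. All names here are illustrative assumptions.
import torch

surrogate = torch.nn.Linear(64, 32)       # stand-in for a trained forward surrogate
d_obs = torch.randn(32)                   # stand-in for observed data
m = torch.zeros(64, requires_grad=True)   # parameters to be inverted

optimizer = torch.optim.Adam([m], lr=1e-2)
for _ in range(200):
    optimizer.zero_grad()
    misfit = 0.5 * torch.sum((surrogate(m) - d_obs) ** 2)  # data-misfit objective
    misfit.backward()                     # gradient w.r.t. m via automatic differentiation
    optimizer.step()                      # Adam update instead of a CG / L-BFGS step
```

The same loop works with any differentiable forward operator; only the optimizer choice changes relative to a classical inversion workflow.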
“…corresponding to $n = N_d, \ldots, 1$, and an additional condition $\boldsymbol{\lambda}_{N_d+1} = \mathbf{0}$. The details related to the mathematical derivation of the adjoint-state method can be found in our previous work (Xiao, Deng, & Wang, 2021). Once the gradient of the objective function with respect to the parameters $\mathbf{m}$ is available, the parameters can be iteratively updated based on gradient-based optimizations, for example, quasi-Newton or Gauss-Newton algorithms (Nocedal, 1999).…”
Section: Formula of Adjoint-Based Inverse Modeling
confidence: 99%
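The backward recursion referenced above can be written out as a generic discrete adjoint-state sketch. The forward model $x_n = f_n(x_{n-1}, \mathbf{m})$, the objective $J(\mathbf{m}) = \sum_n h_n(x_n, \mathbf{m})$, and the symbols $f_n$, $h_n$, $J$ are assumptions for illustration, not necessarily the notation of Xiao, Deng, and Wang (2021):

```latex
% Generic discrete adjoint-state recursion (assumed notation, for illustration).
% Forward model: x_n = f_n(x_{n-1}, m),  n = 1, ..., N_d
% Objective:     J(m) = \sum_{n=1}^{N_d} h_n(x_n, m)
\begin{align}
\boldsymbol{\lambda}_{N_d+1} &= \mathbf{0}, \\
\boldsymbol{\lambda}_n &=
  \left(\frac{\partial f_{n+1}}{\partial x_n}\right)^{\top} \boldsymbol{\lambda}_{n+1}
  + \left(\frac{\partial h_n}{\partial x_n}\right)^{\top},
  \qquad n = N_d, \ldots, 1, \\
\frac{\mathrm{d}J}{\mathrm{d}\mathbf{m}} &=
  \sum_{n=1}^{N_d} \left[
    \left(\frac{\partial f_n}{\partial \mathbf{m}}\right)^{\top} \boldsymbol{\lambda}_n
    + \frac{\partial h_n}{\partial \mathbf{m}}
  \right].
\end{align}
```

One backward sweep over the adjoint variables yields the full parameter gradient at roughly the cost of a single additional forward-like solve, which is what makes gradient-based quasi-Newton or Gauss-Newton updates feasible in high dimensions.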
“…Wu & Lin, 2020). Detailed review can be found in the literature (Tamaddon-Jahromi et al, 2020;Wang et al, 2021) as well as our previous work (Xiao, Deng, & Wang, 2021). It can be inferred that the research advancement has progressed in the field of machine or deep learning, which can be applied to adjoint development to make it more streamlined.…”
mentioning
confidence: 99%
“…Beyond the goal of accelerating compute capabilities, such physics-informed neural networks may offer other advantages such as grid independence, low-memory overhead, differentiability, and on-demand solutions. These properties facilitate the use of deep learning to solve geophysical inverse problems (Zhu et al., 2020; Smith et al., 2021; Xiao et al., 2021; Zhang and Gao, 2021), as a wider selection of algorithms and frameworks then becomes available, such as approximate Bayesian inference techniques like variational inference.…”
Section: Introduction
confidence: 99%
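Because the surrogate (or physics-informed network) is differentiable, gradients of a Bayesian objective can also flow through it, which is what makes techniques such as variational inference practical here. A minimal mean-field sketch follows, under assumed names (`surrogate`, `d_obs`) and a unit-variance Gaussian likelihood that are not taken from the cited works:

```python
# Minimal sketch, not from the cited works: mean-field variational inference
# over inversion parameters m, enabled by a differentiable neural surrogate.
import torch

surrogate = torch.nn.Linear(64, 32)              # stand-in for a differentiable forward model
d_obs = torch.randn(32)                          # stand-in for observed data
mu = torch.zeros(64, requires_grad=True)         # variational mean of q(m)
log_sigma = torch.zeros(64, requires_grad=True)  # variational log-std of q(m)

opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    eps = torch.randn(64)
    m = mu + torch.exp(log_sigma) * eps                        # reparameterization trick
    log_lik = -0.5 * torch.sum((surrogate(m) - d_obs) ** 2)    # Gaussian likelihood, unit noise
    log_prior = -0.5 * torch.sum(m ** 2)                       # standard-normal prior
    entropy = torch.sum(log_sigma)                             # entropy of q up to a constant
    neg_elbo = -(log_lik + log_prior + entropy)                # single-sample negative ELBO
    neg_elbo.backward()                                        # gradients pass through the surrogate
    opt.step()
```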