2020
DOI: 10.1190/geo2019-0138.1
A theory-guided deep-learning formulation and optimization of seismic waveform inversion

Abstract: Deep-learning techniques appear to be poised to play very important roles in our processing flows for inversion and interpretation of seismic data. The most successful seismic applications of these complex pattern-identifying networks will, presumably, be those that also leverage the deterministic physical models on which we normally base our seismic interpretations. If this is true, algorithms belonging to theory-guided data science, whose aim is roughly this, will have particular applicability in our field. …

Cited by 172 publications (49 citation statements). References 56 publications.
“…A time-domain acoustic wave ISP is solved in [33], which develops a theory-designed RNN. It finds that training such a network and updating its weights amounts to solving the ISP, and that the process is equivalent to gradient-based full-waveform inversion.…”
Section: Other Approaches
confidence: 99%
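The equivalence quoted above can be illustrated in a toy setting. The sketch below is an assumption for illustration, not the paper's code: a 1D acoustic finite-difference propagator whose time loop, once unrolled, is structurally a recurrent network whose "weights" are the velocity model `c`. The gradient of the data misfit with respect to `c` (computed here by finite differences, standing in for the automatic differentiation of the unrolled network) is then an FWI gradient.

```python
import math

def simulate(c, nt=200, dt=0.5, dx=1.0, rec=5):
    # Unrolled recurrence: each iteration is one second-order FD time
    # step of the 1D wave equation, i.e. one "RNN cell" whose parameters
    # are the velocity values c[i].
    n = len(c)
    u_prev, u_curr = [0.0] * n, [0.0] * n
    trace = []
    for t in range(nt):
        src = math.exp(-((t - 20) * 0.1) ** 2)   # smooth source wavelet
        u_next = [0.0] * n
        for i in range(1, n - 1):
            lap = (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]) / dx**2
            u_next[i] = 2 * u_curr[i] - u_prev[i] + (c[i] * dt) ** 2 * lap
        u_next[n // 2] += src                    # inject source mid-model
        u_prev, u_curr = u_curr, u_next
        trace.append(u_curr[rec])                # record at one receiver
    return trace

def misfit(c, observed):
    # least-squares data misfit, the usual FWI objective
    return sum((a - b) ** 2 for a, b in zip(simulate(c), observed))

n = 51
c_true = [1.0] * n          # "observed" data come from this model
c_init = [0.9] * n          # starting model for inversion
d_obs = simulate(c_true)

# FWI gradient for one model parameter, via central finite difference
# (a stand-in for differentiating the unrolled recurrence with AD).
k, h = n // 2 + 3, 1e-4
c_plus = list(c_init);  c_plus[k] += h
c_minus = list(c_init); c_minus[k] -= h
g_k = (misfit(c_plus, d_obs) - misfit(c_minus, d_obs)) / (2 * h)
print(g_k)
```

With `dt = 0.5` and `dx = 1.0` the CFL number is 0.5, so the explicit scheme is stable; the same construction with a reverse-mode AD framework would deliver the full gradient in one backward pass rather than one finite difference per parameter.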
“…A summary of papers that solve inverse scattering problems using deep learning schemes. Note that [32] and [33], which belong to "other approaches", are not listed in the figure.…”
Section: Other Approaches
confidence: 99%
“…As for the network hyper-parameters, we follow the discussion in [34] and set the learning rate lr = 40 for the Adam optimizer, with the number of epochs equal to 50. Note that since there is no activation function or normalization applied in the SWINet, a larger learning rate benefits the convergence of the Adam optimizer.…”
Section: A Dataset and Training Parameter Setup
confidence: 99%
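The unusually large learning rate quoted above (lr = 40) is workable because Adam's step is the gradient normalized by its own running magnitude, so lr directly sets the step size in parameter units. The toy below is an assumed illustration, not the cited setup: a minimal pure-Python Adam update applied to a quadratic whose minimum sits thousands of units away, where a large lr covers the distance within the quoted 50 epochs.

```python
import math

def adam(grad_fn, x, lr=40.0, beta1=0.9, beta2=0.999, eps=1e-8, epochs=50):
    # Standard Adam update rule; m and v are first- and second-moment
    # running averages, with the usual bias correction.
    m = v = 0.0
    for t in range(1, epochs + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        # m_hat / sqrt(v_hat) is roughly +-1 for a consistent gradient,
        # so lr is approximately the per-step displacement.
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# toy quadratic objective with minimum at x = 2000
x_opt = adam(lambda x: 2 * (x - 2000.0), x=0.0)
print(x_opt)
```

Since each step moves about lr = 40 units, 50 epochs traverse roughly 2000 units, which is why the large rate pairs naturally with the short schedule; in a network with saturating activations or normalization layers, steps this large would typically destabilize training.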
“…On the other hand, the promising inversion performance of traditional FWI on synthetic and actual complex models indicates the great power of considering the governing wave equation in seismic inversion. Thus, another possible way to generalize seismic deep-learning inversion to large complex models is to incorporate the relevant governing physical rules into the purely data-driven methods, forming a physics-based deep-learning concept [33], [34]. In this concept, a seismic forward-modeling operator needs to be designed and imposed on the deep-learning inversion architecture; this relates the basic generation rules and features of the observed seismic data to the learning process and, as a result, may steer the optimization of the network parameters in a more stable and reasonable direction.…”
Section: Introduction
confidence: 99%
“…Therefore the complexity of NML in both theoretical and computational aspects will increase when the latent-space dimension increases. Hughes et al (2019) and Sun et al (2020) showed that wave-equation modeling is equivalent to a recurrent neural network (RNN) and that the FWI gradient can be calculated automatically by AD. Because CAE training also relies on AD, AD is a perfect tool to numerically connect a CAE architecture to wave-equation inversion.…”
Section: Hybrid Machine Learning Inversion
confidence: 99%
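The role of AD as the common glue between a network and an unrolled simulator, as described above, can be shown with a tiny forward-mode AD implementation. Everything below is an assumed illustration (dual numbers and a toy recurrence, not the cited CAE or wave code): the same differentiation machinery propagates exact derivatives through any sequence of arithmetic operations, whether those operations form a network layer or a wave-equation time step.

```python
class Dual:
    """Dual number a + b*eps: carries a value and its exact derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def recurrence(c, steps=10):
    # toy recurrent state update, standing in for a time step
    # parameterized by a model value c
    u = Dual(0.0)
    for _ in range(steps):
        u = 0.5 * u + c
    return u

c = Dual(1.0, 1.0)       # seed the derivative d/dc
out = recurrence(c)
print(out.val, out.dot)  # state and exact du/dc after 10 steps
```

Here `out.dot` is the exact derivative of the 10-step recurrence with respect to `c` (the geometric sum 2 - 2**-9), obtained without any hand-derived adjoint; production frameworks apply the same idea in reverse mode, which is what makes one gradient engine serve both the CAE and the wave-equation layers.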