2022
DOI: 10.1190/geo2021-0491.1

Least-squares reverse time migration via deep learning-based updating operators

Abstract: Two common issues with least-squares reverse time migration (LSRTM) are the many iterations required to produce substantial subsurface imaging improvements and the difficulty of choosing adequate regularization strategies with optimal hyperparameters. We investigate how supervised learning can mitigate these shortcomings by solving the LSRTM problem through an iterative deep learning framework inspired by the projected gradient descent algorithm. In particular, we develop an image-to-image approach interl…
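
The abstract describes replacing the update/projection step of an iterative LSRTM scheme with a trained image-to-image network, in the spirit of projected gradient descent. The following is a minimal sketch of that idea, not the authors' implementation: `L` stands in for a linearized (Born) forward operator as a dense matrix acting on a flattened image, `d` for the observed data, and `Net` for the learned updating operator; all names, sizes, and the training-free toy setup are illustrative assumptions.

```python
# Minimal sketch of a learned projected-gradient-descent loop for LSRTM.
# Assumptions (not from the paper): L is a generic linear forward operator
# given as a dense matrix, and `Net` is a small stand-in for the trained
# image-to-image updating operator.
import torch
import torch.nn as nn

class Net(nn.Module):
    """Toy image-to-image updating operator (placeholder for the trained CNN)."""
    def __init__(self, channels=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels, 3, padding=1),
        )

    def forward(self, m):
        return m + self.body(m)          # residual update of the image

@torch.no_grad()
def learned_pgd(L, d, net, n_iter=5, step=1e-3, shape=(32, 32)):
    """m_{k+1} = Net(m_k - step * L^T (L m_k - d)), a learned PGD-style loop."""
    m = torch.zeros(shape)
    for _ in range(n_iter):
        residual = L @ m.reshape(-1) - d                # data misfit
        grad = (L.t() @ residual).reshape(shape)        # adjoint (migration) of the misfit
        m = net((m - step * grad)[None, None])[0, 0]    # learned update / projection
    return m

# Toy usage with a random operator and random data
L = torch.randn(64, 32 * 32)
d = torch.randn(64)
m_est = learned_pgd(L, d, Net())
```

In an actual LSRTM setting, `L` and its adjoint would be wave-equation Born modeling and reverse time migration rather than matrix products, and `Net` would be trained on pairs of intermediate and reference images.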

Cited by 20 publications (5 citation statements)
References 79 publications
“…Various DNN models, trained by synthetic datasets, have been successfully applied to solve multiple geophysical problems including seismic denoising (7, 22, 72, 73), seismic migration (36, 40), velocity model building (25, 29, 42, 43, 45, 74, 75), impedance inversion (62, 76), and seismic interpretation (52, 71, 77, 78). As shown in Fig.…”
Section: Imposing Constraints On Data
confidence: 99%
“…During the past decade, DNNs have been widely used in various geophysical research areas (1–5), including seismology (6–14), atmospheric science (15–17), and planetary and space science (18–20). DNNs have been particularly intensively studied in exploration geophysics to accelerate and advance the entire workflow of data processing (21–24), tomography (25–29), forward modeling (30–35), migration (36–40), velocity model building (41–47), and interpretation (48–53).…”
confidence: 99%
“…Instead of using 1D models and noisy traces, the networks were trained with synthetically generated 2D reflectivity models and their corresponding noisy data slices to improve lateral continuity and robustness to noise. We follow Torres and Sacchi (2022a) to generate synthetic reflectivity models (as shown in Figure 5) considering the spatial correlation imposed by depositional processes while mimicking fractured, faulted and folded sedimentary structures that can take arbitrary shapes and orientations. The training samples reflect that physical, nonrandom processes produce the earth's geology, and subsurface layers can have many arbitrary orientations, including sharp discontinuities.…”
Section: Numerical Examples
confidence: 99%
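
As a rough illustration of the kind of training-set generation this statement refers to, the sketch below builds a random layered reflectivity model and perturbs it with a smooth fold and a single vertical fault. It is a hedged approximation under invented parameter ranges; the actual recipe of Torres and Sacchi (2022a) is not reproduced here, and the function name and arguments are illustrative.

```python
# Hedged sketch of one way to generate folded/faulted layered reflectivity
# models for training; not the published procedure.
import numpy as np

def random_reflectivity(nz=128, nx=128, n_layers=20, seed=None):
    rng = np.random.default_rng(seed)

    # 1D layered reflectivity: random reflection coefficients at random depths
    depths = np.sort(rng.choice(np.arange(5, nz - 5), size=n_layers, replace=False))
    refl_1d = np.zeros(nz)
    refl_1d[depths] = rng.uniform(-1.0, 1.0, size=n_layers)

    # Folding: shift each trace vertically along a smooth sinusoidal structure
    x = np.arange(nx)
    fold = (rng.uniform(2, 10) * np.sin(2 * np.pi * rng.uniform(0.5, 2.0) * x / nx
                                        + rng.uniform(0, 2 * np.pi))).astype(int)
    model = np.stack([np.roll(refl_1d, s) for s in fold], axis=1)  # shape (nz, nx)

    # Faulting: throw one side of a random vertical fault up or down
    fault_x = rng.integers(nx // 4, 3 * nx // 4)
    throw = rng.integers(-8, 9)
    model[:, fault_x:] = np.roll(model[:, fault_x:], throw, axis=0)
    return model

reflectivity = random_reflectivity(seed=0)
```

Convolving such models with a wavelet and adding noise would then give the noisy data slices used as network inputs, while the clean reflectivity serves as the training target.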
“…Such networks bypass the use of explicit physics operators but rely on a vast amount of training samples to learn the underlying physics of the problem. To reduce the dependency on training data, learned iterative schemes (Torres & Sacchi, 2022a) incorporate physics into the learning process and replace various components of unrolled iterative reconstruction algorithms with neural network computations. Alternatively, to avoid iterations, the learned post-processing method first maps the measurements to the model space through a known physics operator (either the pseudo-inverse $\mathbf{L}^\dagger : \mathbb{R}^m \rightarrow \mathbb{R}^n$ or an approximation to it) and then trains a neural network to learn a model perturbation that potentially improves this initial reconstruction (Kaur et al., 2020; Zhang et al., 2021).…”
Section: Introduction
confidence: 99%
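
To make the contrast in this statement concrete, the sketch below illustrates the learned post-processing route: the data are first mapped to the model space with the pseudo-inverse of a toy linear operator (standing in for migration), and a network then predicts a corrective perturbation. The operator, network, and sizes are placeholders, not the setups of Kaur et al. (2020) or Zhang et al. (2021).

```python
# Sketch of learned post-processing: physics-based initial image plus a
# network-predicted perturbation. All objects here are illustrative toys.
import torch
import torch.nn as nn

# Small stand-in for the post-processing CNN (not from the cited papers)
post_net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)

@torch.no_grad()
def learned_postprocessing(L, d, net, shape=(32, 32)):
    # Initial reconstruction with the pseudo-inverse L^dagger : R^m -> R^n
    m0 = (torch.linalg.pinv(L) @ d).reshape(shape)
    # The network predicts a perturbation dm that refines this initial image
    dm = net(m0[None, None])[0, 0]
    return m0 + dm

L = torch.randn(64, 32 * 32)   # toy linear operator (m = 64 data samples, n = 1024 pixels)
d = torch.randn(64)            # toy observed data
m_refined = learned_postprocessing(L, d, post_net)
```

The learned iterative schemes mentioned in the same statement would instead embed a network like `post_net` inside each unrolled iteration, as in the sketch after the abstract above.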
“…[30] also used CNNs on dip-angle domain elastic reverse-time migration to improve image quality; Ref. [31] introduced the idea to minibatch LSRTM; and Torres and Sacchi [32,33] used residual CNN blocks in LSRTM with a preconditioned conjugate gradient least-squares algorithm (CGLS) to enhance image resolution; Ref. [34] used deep learning for accelerating prestack correlative LSRTM.…”
Section: Introduction
confidence: 99%