2019
DOI: 10.1111/1365-2478.12849

Imaging of elastic seismic data by least‐squares reverse time migration with weighted L2‐norm multiplicative and modified total‐variation regularizations

Abstract: Least‐squares reverse time migration has the potential to yield high‐quality images of the Earth. Compared with acoustic methods, elastic least‐squares reverse time migration can effectively address mode conversion and provide velocity/impedance and density perturbation models. However, elastic least‐squares reverse time migration is an ill‐posed problem and suffers from a lack of uniqueness; further, its solution is not stable. We develop two new elastic least‐squares reverse time migration methods based on …
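As a hedged, generic sketch (the paper's exact regularizer definitions are not given in the truncated abstract), least‐squares reverse time migration with an additive penalty seeks the perturbation model m that minimizes

J(\mathbf{m}) = \tfrac{1}{2}\,\lVert \mathbf{L}\mathbf{m} - \mathbf{d} \rVert_2^2 + \lambda\, R(\mathbf{m}),

where L is the elastic Born (demigration) operator, d the observed multicomponent data and R a regularizer such as a weighted L2 norm or total variation. In a multiplicative formulation the regularizing factor instead multiplies the data misfit, J(\mathbf{m}) = \tfrac{1}{2}\,\lVert \mathbf{L}\mathbf{m} - \mathbf{d} \rVert_2^2 \cdot R(\mathbf{m}). Either form is intended to stabilize the ill‐posed, non‐unique inversion described above.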

Cited by 18 publications (6 citation statements).
References 53 publications (85 reference statements).
“…The observation process adopted here for the signal is non‐adaptive, and the measurement matrix does not depend on the structure of the signal. By solving the optimization problem under the ℓ0 norm [16], the signal can be reconstructed accurately, that is, by minimizing the ℓ0 norm of the signal subject to the measurement constraint.…”
Section: The Basic Theory of MCS Methods (mentioning)
Confidence: 99%
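As an illustration of the reconstruction problem mentioned in this excerpt (the symbols x, y and Φ follow standard compressed‐sensing notation and are assumptions, not notation taken from the cited paper), the ℓ0 problem and its usual convex ℓ1 relaxation read

\min_{x} \lVert x \rVert_0 \ \ \text{s.t.}\ \ y = \Phi x \qquad \text{and} \qquad \min_{x} \lVert x \rVert_1 \ \ \text{s.t.}\ \ y = \Phi x .

Because the ℓ0 problem is combinatorial, practical implementations typically solve the ℓ1 relaxation (or use a greedy method such as orthogonal matching pursuit), which recovers the same sparse signal under suitable conditions on Φ such as the restricted isometry property.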
“…Methods to exploit the joint sparsity attainable from the multiple measurement vectors (MMVs) have subsequently been developed in [21,18,25,72,23,57], and, somewhat relatedly, additional refinement to (10) can be made by employing weighted ℓ1 or ℓ2 regularization [15,20,29,55,53]. Both approaches have the effect of more heavily penalizing sparse regions in the sparse domain of the solution than in locations of support.…”
Section: Sparse Bayesian Learning (mentioning)
Confidence: 99%
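As a sketch of the weighted‐regularization idea described in this excerpt (the notation is assumed, not taken from the cited papers), a weighted ℓ1 penalty has the form

\min_{x} \tfrac{1}{2}\,\lVert A x - b \rVert_2^2 + \lambda \sum_i w_i\, \lvert x_i \rvert ,

where the weights w_i are chosen large at indices believed to lie outside the support of x and small (or zero) on the estimated support, so that sparse regions are penalized more heavily than locations of support; a weighted ℓ2 variant replaces |x_i| with x_i^2.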
“…Another, more effective, solution is to add a sparse regularization term (e.g. total variation, Lin & Huang 2015; Ren & Li 2019; or the L1 norm, Guitton 2012) to the misfit function to avoid data overfitting and mitigate these artefacts.…”
Section: Numerical Examples (mentioning)
Confidence: 99%
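For reference, a common smoothed isotropic form of the total‐variation regularizer for a 2‐D model m (a generic sketch, not the specific modified total‐variation term of the paper above) is

\mathrm{TV}(\mathbf{m}) = \sum_{i,j} \sqrt{ (m_{i+1,j} - m_{i,j})^2 + (m_{i,j+1} - m_{i,j})^2 + \varepsilon^2 } ,

where the small constant ε keeps the term differentiable so that it can be combined with gradient‐based least‐squares migration.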