2017
DOI: 10.1190/tle36080654.1

Sparse seismic wavefield sampling

Abstract: Seismic acquisition is a trade-off between image quality and cost. While there is an increasing need for higher quality images due to the more complex geologic settings of reservoirs, there is also a strong desire to reduce the cost and cycle time of seismic acquisition. Meeting these conflicting ambitions requires creative solutions. New hardware developments aim at improving survey efficiency and image quality. To optimally leverage new hardware and maximize survey efficiency, their development should go tog…

Cited by 17 publications (2 citation statements)
References 23 publications
“…can minimize the maximum mutual coherency of a dictionary matrix such that densely sampled data can be effectively recovered from sparsely sampled data. Campman et al. (2017) also showed its applicability to design activation times of overlapped sources instead of the use of time dithering from a random realization. Additionally, Wu, Blacquière and Groenestijn (2015) introduced blended acquisition using shot repetition that activates multiple shots at the same location within a short time interval.…”
Section: Figure 14
confidence: 99%
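The statement above refers to minimizing the maximum mutual coherency of a dictionary matrix so that densely sampled data can be recovered from sparse samples. As a minimal illustrative sketch (not the authors' actual design procedure), the mutual coherence of a sampling matrix is the largest absolute normalized inner product between any two distinct columns; compressed-sensing recovery guarantees improve as this value shrinks:

```python
import numpy as np

def mutual_coherence(A):
    """Max absolute normalized inner product between distinct columns of A."""
    An = A / np.linalg.norm(A, axis=0)   # unit-norm columns
    G = np.abs(An.T @ An)                # Gram matrix of column correlations
    np.fill_diagonal(G, 0.0)             # ignore self-correlation
    return G.max()

# Random Gaussian measurement matrices typically have low coherence,
# which favors recovery of sparse signals from few measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((32, 64))
c = mutual_coherence(A)
```

A coherence-minimizing acquisition design would search over candidate sampling geometries (or source activation times) for the one with the smallest value of this quantity; the matrix and sizes here are placeholders, not from the paper.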
“…Mueller et al. (2016) describe a method for optimizing near-orthogonal source codes using a simulated annealing algorithm. Campman et al. (2017) use the so-called Golomb ruler to optimize the shot-firing time in an algebraic way such that the correlation property is maximized. In the case of shot repetition, we use a trial-and-error algorithm to optimize the orthogonal properties of the blending code, which means that we aim to obtain source code pairs with spiky autocorrelation and minimal crosscorrelation.…”
Section: Source-code Optimization
confidence: 99%
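The Golomb-ruler idea referenced above can be illustrated with a short sketch (an assumption-laden toy, not the paper's algorithm): a Golomb ruler is a set of integer marks whose pairwise differences are all distinct. Using such marks as shot-firing times gives a 0/1 firing sequence whose autocorrelation has a single spike at zero lag and sidelobes of at most 1, which is exactly the "spiky autocorrelation" property sought for blending codes:

```python
from itertools import combinations

def is_golomb(marks):
    """True if all pairwise differences between the marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

def autocorr_sidelobe_max(marks):
    """Max autocorrelation of the 0/1 firing sequence at nonzero lags.

    For a Golomb ruler each lag is matched by at most one pair of marks,
    so the result is at most 1 (vs. len(marks) at zero lag)."""
    marks = set(marks)
    return max(
        sum(1 for m in marks if m + lag in marks)
        for lag in range(1, max(marks) + 1)
    )

ruler = [0, 1, 4, 9, 11]  # an optimal Golomb ruler of order 5
# is_golomb(ruler) -> True; autocorr_sidelobe_max(ruler) -> 1
```

A trial-and-error blending-code search of the kind the excerpt describes would score candidate firing-time sets with functions like these, keeping codes with unit sidelobes and low cross-correlation against the other sources' codes.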