In the past few years, significant progress has been made on new velocity analysis algorithms. In the first part of this paper, we briefly summarize recent advances in velocity analysis. We then describe a new model-based, globally optimized residual curvature analysis algorithm that we have developed. Like conventional residual curvature analysis, the algorithm is based on the principle that after prestack migration with a correct velocity model, an image in a common image point (CIP) gather is aligned horizontally regardless of structure. Unlike conventional residual curvature analysis, this algorithm uses not only the interpreted CIP gathers but also the interpreted migrated depth section as input. The algorithm is model-based and uses model-based CIP ray tracing to relate residual moveouts in CIP gathers to errors in the velocity model. Residual moveouts measured in CIP gathers are used globally in the optimization process to update the whole velocity model, and model-based normal-incidence ray tracing is used to update the reflector boundaries.

REVIEW OF RECENT ADVANCES IN VELOCITY ANALYSIS

Recent advances in velocity analysis fall into two categories: 1) traveltime inversion; 2) migration velocity analysis.

Traveltime inversion

Traveltime inversion (TI) (Bishop et al., 1985; Stork and Clayton, 1991) estimates a depth velocity model from traveltimes picked from prestack data. The main advantage of TI is that it is formulated as an optimization problem, so model updating is effective and efficient. However, in areas of complex geological structure, picking prestack traveltimes in surface seismic data is almost unfeasible.
Picking prestack traveltimes can suffer from the following problems: 1) in the case of complex reflector geometry, seismic energy reflected from different parts of a reflector may arrive at the same receiver location; 2) reflection arrivals may be contaminated by diffraction energy; 3) the signal-to-noise ratio is often low in areas of complex structure. In the past few years, several researchers have tried to solve the traveltime picking problem. IFP (Institut Francais du Petrole) developed a method called SMART (Sequential Migration Aided Reflection Tomography) (Delprat-Jannaud and Lailly, 1993) to address it. The main idea of SMART is to use an approximate velocity model to migrate the seismic data, pick the imaged reflectors in the cube of migrated shot gathers, and finally trace rays that propagate in the same velocity model as the one used for the migration and that reflect off the picked imaged reflectors. The authors claimed that because the ray tracing undoes what the migration has done, traveltimes can be recovered even with an approximate velocity model. More recently, DATAID (1994) used a similar approach, but performed the migration and ray tracing in the common-offset gather instead of the common-shot gather. The main point of these approaches is that instead of directly picking events in the time domain, picking is done after depth migration...
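The flatness principle that residual curvature analysis relies on — a correctly migrated event images at the same depth for every offset in a CIP gather — can be sketched numerically. The function below is a hypothetical illustration of that principle only, not the paper's algorithm (which relates moveout to model error through CIP ray tracing and global optimization); the name `residual_moveout` and the peak-picking scheme are assumptions for the sketch.

```python
import numpy as np

def residual_moveout(cip_gather, dz):
    """Depth deviation of the strongest event at each offset, relative
    to its near-offset image depth. cip_gather: 2D amplitude array of
    shape (n_depth, n_offset); dz: depth sample interval."""
    peak_idx = np.argmax(np.abs(cip_gather), axis=0)  # event depth index per offset
    z = peak_idx * dz
    return z - z[0]  # zero everywhere when the event images flat

# Flat event (correct velocity model): zero residual moveout.
flat = np.zeros((201, 8))
flat[100, :] = 1.0
print(residual_moveout(flat, dz=10.0))    # all zeros

# Curved event (erroneous velocity model): image depth drifts with offset.
curved = np.zeros((201, 8))
for j in range(8):
    curved[100 + j, j] = 1.0
print(residual_moveout(curved, dz=10.0))  # [0., 10., 20., ..., 70.]
```

Nonzero deviations such as these are what the algorithm feeds, via ray tracing, into the global velocity-model update.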
Production data analysis and reservoir simulation of the Eagle Ford shale are very challenging due to the complex characteristics of the reservoir and its fluids. Eagle Ford reservoir complexity is expressed in enormous vertical and horizontal petrophysical heterogeneity, stress-sensitive permeability, and the existence of multi-scale natural fracture and fault systems. This complexity makes predicting the geometry and conductivity of the hydraulic fractures resulting from the stimulation process rather challenging. Fluid complexity, in turn, is demonstrated in multi-phase flow, liquid loading in the wellbore, condensate banking, etc. Given this complexity, 3D reservoir modeling and numerical simulation have the relative advantage of addressing irregular fracture geometry, variable stimulated reservoir volume (SRV), and multi-phase flow. The South Texas Asset Team at Pioneer Natural Resources is establishing a workflow for dynamic reservoir modeling that integrates all reservoir/wellbore parameters (formation evaluation, drilling, completion, stimulation, pre-/post-fracture surveillance, and well performance data) in order to address key field-development questions such as depletion efficiency, drainage area, well interference, and condensate banking effects. In this paper, a case study demonstrates the integration of various measurements and surveillance data to build a variable-SRV reservoir model. The model described here has the following building blocks: 1) Formation evaluation: includes all reservoir characterization data derived from logs, 3D seismic inversions, and structural attributes. 2) Surveillance data integration: microseismic data (the backbone of this work) are integrated with chemical and radioactive tracer logs.
3) Well performance data integration: production data are used to identify the different flow regimes over the well history and to set bounds on stimulation parameters such as fracture half-length and permeability (x_f√k). 4) Numerical simulation: microseismic attributes (density and magnitude) are converted to a permeability model after being calibrated with tracer logs and the production flow-regime parameter (x_f√k). PVT data are matched against an equation of state (EOS) and input into the model. Production data history matching, sensitivity analysis, and forecasting indicate the following: a) the SRV created by fracture stimulation has permeability that fades away from the wellbore; b) fracture geometry is variable and results in an irregular drainage area along the lateral; c) the onset of condensate banking near the wellbore and along the fracture(s) can occur within the first year of production if drawdown is not managed properly.
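As a hedged illustration of step 3 (not Pioneer's actual workflow): formation linear flow into planar fractures appears as a half slope on a log-log plot of rate-normalized pressure drop Δp/q versus time, and the slope of Δp/q versus √t then bounds the linear-flow parameter x_f√k. The sketch below assumes idealized single-phase, constant-rate data and omits the unit-conversion constants that relate the √t slope to x_f√k in field units.

```python
import numpy as np

def loglog_slope(t, dpq):
    """Least-squares slope of log(dp/q) vs log(t); ~0.5 during linear flow."""
    return np.polyfit(np.log(t), np.log(dpq), 1)[0]

def sqrt_time_slope(t, dpq):
    """Slope m of dp/q vs sqrt(t); x_f*sqrt(k) is inversely proportional
    to m (unit constants omitted in this sketch)."""
    return np.polyfit(np.sqrt(t), dpq, 1)[0]

# Synthetic linear-flow data: dp/q = 2.5 * sqrt(t).
t = np.linspace(1.0, 365.0, 200)   # producing time, days
dpq = 2.5 * np.sqrt(t)             # rate-normalized pressure drop
print(loglog_slope(t, dpq))        # ~0.5 -> linear flow regime
print(sqrt_time_slope(t, dpq))     # ~2.5 -> bounds x_f*sqrt(k)
```

In practice the half-slope window must first be isolated from wellbore-storage and boundary-dominated periods before the √t slope is read.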
Sinusoidal noise often contaminates seismic data. When this noise is large compared to seismic signals, it adversely affects prestack seismic processing and subsequent interpretation. We develop a digital least-squares filtering algorithm for canceling stationary sinusoidal noise in seismic data. The method effectively cancels sinusoidal noise when the noise is stationary, which is typical for recordings of a few seconds in length. This procedure differs from the usual notch-filtering techniques because the sinusoidal noise is canceled without notching the signal spectrum. Since the method requires that the line frequency be accurately known, the algorithm can automatically search the trace spectrum to find the exact sinusoidal frequency value needed for filter design. The algorithm is highly automated and requires no input parameters when the interference comes from power lines or generators. We use model and field data to quantify the algorithm’s performance.
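A minimal sketch of this idea, assumed from the abstract rather than taken from the paper's exact implementation: estimate the interference frequency from a zero-padded amplitude spectrum, fit the sinusoid's in-phase and quadrature amplitudes by least squares, and subtract the fit. Because only the fitted sinusoid is removed, signal energy at neighboring frequencies is left untouched, unlike a notch filter. The function name and search band are illustrative choices.

```python
import numpy as np

def cancel_sinusoid(trace, dt, f_min=40.0, f_max=70.0):
    """Subtract one stationary sinusoid whose frequency lies in
    [f_min, f_max] Hz (e.g. 50/60 Hz power-line interference)."""
    n = len(trace)
    # 1) Locate the spectral peak in the search band; zero padding
    #    refines the frequency estimate beyond the raw DFT bin spacing.
    nfft = 8 * n
    freqs = np.fft.rfftfreq(nfft, dt)
    spec = np.abs(np.fft.rfft(trace, nfft))
    band = (freqs >= f_min) & (freqs <= f_max)
    f0 = freqs[band][np.argmax(spec[band])]
    # 2) Least-squares fit of a*cos(2*pi*f0*t) + b*sin(2*pi*f0*t).
    t = np.arange(n) * dt
    G = np.column_stack([np.cos(2 * np.pi * f0 * t),
                         np.sin(2 * np.pi * f0 * t)])
    coeffs, *_ = np.linalg.lstsq(G, trace, rcond=None)
    # 3) Subtract the fitted interference; nearby frequencies untouched.
    return trace - G @ coeffs, f0

# Synthetic trace: a 20 Hz "signal" plus strong 50 Hz interference.
dt = 0.002
t = np.arange(1500) * dt                # 3 s recording at 2 ms sampling
signal = np.sin(2 * np.pi * 20.0 * t)
noisy = signal + 3.0 * np.sin(2 * np.pi * 50.0 * t + 0.7)
clean, f0 = cancel_sinusoid(noisy, dt)
```

On this synthetic example the estimated frequency lands on 50 Hz and the residual after subtraction is essentially the 20 Hz signal alone.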