Over 60% of the world's daily oil production today comes from giant fields, where production is in decline (Robelius, 2007). Extended development and management of such complex projects requires an integrated and cost-efficient approach. While there are many theories on the integration of reservoir and network models, there is very limited experience of actually applying such models to field development plans (FDP) as routine decision-making tools. This paper presents a successful case study of creating a robust and fast integrated model of a giant gas-condensate field with a complex gathering network and processing facilities. The integrated asset modelling (IAM) approach has proven to bring value over conventional "standalone" modelling in this particular complex system, which includes three inter-flowing processing facilities simultaneously constrained by processing capacities, gas compression for re-injection and overall export agreements. In addition, all the wells (about 100 producers and 20 injectors) have individual constraints (liquid rate and drawdown limits for producers, injection rate and pressure limits for gas re-injection wells). The major technical challenges arising from the complexity of such modelling are discussed here. Increased stability (honouring all constraints), accuracy (reproducing actual data) and speed (4-5 min per timestep) were achieved by embedding custom algorithms and workflows into the IAM while using fully compositional ECLIPSE and complex GAP network models. One of the major challenges was to properly model and optimize inter-unit flows and to correctly prioritize low-GOR wells. In this paper a project from the FDP is used as an example to demonstrate the benefits and limitations of the integrated modelling approach. Based on the case study, the integrated model has proved to bring incremental value by providing more accurate production profiles and reducing development costs by avoiding "overdesigning".
These and other conclusions are discussed in the paper in the context of the actual project.
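The prioritization of low-GOR wells under shared facility constraints mentioned above can be illustrated with a minimal sketch. This is not the custom IAM algorithm from the study; it is a simple greedy stand-in, and all well names, rates and the capacity figure are illustrative assumptions.

```python
# Hedged sketch: greedy selection of producers in ascending-GOR order until a
# shared gas-handling capacity is reached. A simplified stand-in for the
# custom prioritization logic described in the abstract; data are invented.

def prioritize_wells(wells, gas_capacity):
    """Select wells by ascending GOR within the facility gas limit.

    wells: list of dicts with 'name', 'oil_rate' (stb/d), 'gor' (scf/stb).
    gas_capacity: facility gas-handling limit (scf/d).
    Returns (names of chosen wells, total oil rate produced).
    """
    chosen, gas_used, oil_total = [], 0.0, 0.0
    for w in sorted(wells, key=lambda w: w["gor"]):
        gas = w["oil_rate"] * w["gor"]          # associated gas of this well
        if gas_used + gas <= gas_capacity:      # skip wells that would exceed the limit
            chosen.append(w["name"])
            gas_used += gas
            oil_total += w["oil_rate"]
    return chosen, oil_total

# Illustrative wells (names and rates are hypothetical):
wells = [
    {"name": "P-101", "oil_rate": 2000.0, "gor": 1500.0},
    {"name": "P-102", "oil_rate": 1500.0, "gor": 4000.0},
    {"name": "P-103", "oil_rate": 1800.0, "gor": 2500.0},
]
chosen, oil = prioritize_wells(wells, gas_capacity=8.0e6)
# The two lowest-GOR wells fit within the limit; P-102 is deferred.
```

A real implementation would also honour per-well drawdown and liquid-rate limits, but the ordering principle is the same: oil throughput per unit of gas-handling capacity is maximized by producing the leanest wells first.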
Reservoir studies, including preparation of a field development plan, are processes typically dominated by time constraints. In general, reservoir studies consist of multiple geoscience activities integrated to build a fine geological model that eventually leads to an upscaled numerical model suitable for history matching and forecast simulations. In the simulation stage, the quality and effectiveness of the activity is highly dependent on the computational efficiency of the numerical model. This is particularly true for complex, supergiant carbonate reservoirs. Often, even with today's simulators, upscaling is still needed and simplifications can be implemented to allow computationally intensive history-matching and risk-analysis workflows. This paper provides real field examples where these issues were faced and successfully managed without further simplifying the geological concept of the model: principles of reservoir simulation and common-sense reservoir engineering were used to adjust the properties of the model and thus speed up the numerical simulation. Tuning included a combination of solutions, such as deactivating critical cells whenever possible, calibrating convergence and time-stepping controls, tweaking field management to prevent instability in the computation, and optimizing the number of cores and the split of cells among cores to improve load balancing and scalability. These solutions were applied to two super-giant carbonate fields: a triple-porosity (matrix, karst and fractures) undersaturated light-oil reservoir and a supercritical gas-condensate reservoir. The former field was described by an upscaled model of about 700,000 active cells with a dual-porosity/dual-permeability formulation; the latter was described by a relatively coarse model of about 400,000 active cells with a single-porosity formulation.
A large speed-up, up to five times with respect to the reference simulations, was achieved without simplifying the geology or perceptibly losing accuracy. Benefits were achieved with both conventional and high-resolution simulators.
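The load-balancing lever mentioned above can be made concrete with a small sketch: parallel scalability is limited by the most heavily loaded core, so an uneven split of active cells among cores directly caps the achievable speed-up. The cell counts below are illustrative, not from the study.

```python
# Hedged sketch: quantifying load imbalance for a candidate split of active
# cells among cores. Imbalance = max load / mean load; 1.0 is a perfect
# split, and runtime scales roughly with the most loaded core.

def load_imbalance(cells_per_core):
    """Ratio of the busiest core's load to the mean load across cores."""
    mean = sum(cells_per_core) / len(cells_per_core)
    return max(cells_per_core) / mean

# Hypothetical decomposition of ~700,000 active cells over 8 cores:
uneven = [110000, 95000, 80000, 90000, 70000, 85000, 95000, 75000]
even = [87500] * 8
# The uneven split leaves one core ~26% more loaded than the average,
# so the simulation runs roughly that much slower than the ideal split.
```

Rebalancing the domain decomposition toward the even split is one of the "free" speed-ups that requires no change to the geological model.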
This paper presents a successful case study of building a robust "in-situ" Integrated Production System Model (IPSM) directly in the reservoir simulator (ECLIPSE), without the added complexity of third-party network simulators (e.g. PETEX) and scripts. Based on the case study, the ECLIPSE-based IPSM has proved to add incremental value over the standalone ECLIPSE model by providing more accurate production profiles and reducing development costs by avoiding "under-designing" or "over-designing" of project facilities. The high-level group tree was replaced by a detailed architecture that is well aligned with the actual surface field network layout. Given the large number of production nodes, HFP (horizontal flow performance) tables for existing and future pipelines are generated and updated using an in-house automatic workflow built on the OPEN SERVER platform and linked to the nodal-analysis software. Flow rate-pressure calibration of simulated data to observed data across all 400+ manifolds, pipes and well chokes is performed beforehand, based on regularly updated data from the internal real-time production data management system. The current reservoir pressure at the Karachaganak gas-condensate field (KGK) is below the saturation point. The producing gas-oil ratio and water cut at well and field level increase over the production life, which in turn creates bottlenecks in the surface gathering system. The production system includes around 200 production and injection wells connected to three inter-flowing processing facilities, which are simultaneously constrained by gas and water processing capacities, gas compression for re-injection and contractual gas-sale obligations. Recent practice for assessing oil plateau extension projects in the field was to use either the standalone ECLIPSE model or a full-scale Integrated Asset Model (IAM) based on the integration of RESOLVE, GAP and ECLIPSE.
The ECLIPSE-based IPSM reproduced historical pressure losses of the surface pipelines at a resolution similar to the IAM model, at much shorter computational runtimes. It also improved the rate-pressure transition from history to forecast compared with standalone results. In addition, the new model setup helped to identify bottlenecks in the well flowlines and to propose solutions by properly rerouting the wells. Furthermore, significant CAPEX savings were achieved by finding the optimal size and number of trunklines from manifolds to processing plants. The novelty of this production-forecasting tool lies in generating and integrating a detailed surface gathering system into the dynamic model without third-party network simulators, and in bringing its accuracy to levels sufficient not only for long-term forecasting but also for medium- and short-term optimization work. This study is particularly valuable for large oil and gas fields, where high precision is critical for decision making and production success.
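The rate-pressure calibration step described above can be sketched in simplified form. For each pipe, observed rate/pressure-drop pairs are used to fit a single loss coefficient; this is a deliberately reduced stand-in for the full correlation tuning performed across the 400+ manifolds, pipes and chokes, and the quadratic loss model and all data values are assumptions for illustration.

```python
# Hedged sketch: least-squares calibration of one pressure-drop coefficient
# per pipe, assuming a fully turbulent loss model dP ~ k * q**2. A toy
# version of matching simulated pipeline losses to observed data.

def fit_dp_coefficient(rates, dps):
    """Closed-form least-squares fit of k in dP = k * q**2.

    rates: observed flow rates (e.g. stb/d).
    dps:   observed pressure drops at those rates (e.g. psi).
    Minimizing sum((dp - k*q^2)^2) gives k = sum(dp*q^2) / sum(q^4).
    """
    num = sum(dp * q * q for q, dp in zip(rates, dps))
    den = sum(q ** 4 for q in rates)
    return num / den

# Illustrative observations for one flowline:
rates = [1000.0, 2000.0, 3000.0]   # stb/d
dps = [10.0, 41.0, 89.0]           # psi
k = fit_dp_coefficient(rates, dps)
predicted = [k * q * q for q in rates]  # calibrated model vs observations
```

In practice each pipe would use a proper multiphase flow correlation and the fitted coefficients would feed the HFP table generation, but the calibration principle, i.e. minimizing the mismatch between simulated and observed pressure drops per network element, is the same.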
This paper presents a practical methodology for determining error bounds in the oil back-allocation procedure and its application to decision making in the development of a giant gas-condensate reservoir in the Caspian region. The case study of allocating fluid rates to individual wells was performed by adjusting reservoir pressure and gas-oil ratio with time using computer-aided analysis. This work describes improvements in well performance modelling and outlines an effective workflow whereby well performance can be estimated at any given period. The workflow has provided a cost-effective solution that reduces uncertainties in estimating production volumes at the well level. Well models are built using production well-test data. These tests are usually performed, at best, on an annual basis for fields with a large number of producing wells. Therefore, the well models are nearly always out of date by a matter of months or even years, and the allocation factor associated with traditional models can exceed the reasonable accuracy level. In addition to traditional models, well-performance software models have been developed and used. By adjusting the software models for predictable changes in reservoir pressure and gas-oil ratio, the oil allocation factor errors can be reduced significantly. This methodology can also be applied to historical data to improve the production assumptions used to history match the field simulation model. A comparative study of the different allocation methods and their impact on allocation factor errors and net present values has been carried out, confirming the importance of correct oil reconciliation at the well level.
Introduction
Reconciliation of fiscally measured hydrocarbon production with estimated production from the associated production wells is common practice in the oil industry, particularly for gas-condensate wells.
This process is known as "allocation" and is important for a number of reasons, including field surveillance and volumetric input to reservoir simulators (Cramer et al., 2009). The availability of accurate production volumes at the outlet of the production network and at the well level is essential to the workflows that target the optimization of the economic potential of reservoir performance (Stundner and Nunez, 2006). Correct prediction of reservoir performance helps to support operational decisions on the field development schedule and to maximize the reservoir's economic value (Ibrahim, 2008). Inaccuracies in production allocation consequently decrease the predictive capability of the reservoir simulation model used in investment decisions such as whether or not to drill more wells. Conventional gravity-based test separators are used for well testing, which can lead to errors in the measurement of multiphase fluid flow from a well (API, 2005) and also to errors in the allocation of the fluid at the time of the well test; e.g. liquid carry-over can cause an erroneous increase in the calculated GOR. Measurement errors of this na...
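The allocation procedure discussed above follows a standard pattern: per-well estimates from well-test-based models are scaled so they reconcile with the fiscally metered total, and the scaling ratio is the allocation factor whose deviation from 1.0 indicates model error. The sketch below shows the basic proportional scheme; well names and volumes are illustrative, and the paper's methodology adds error bounds and pressure/GOR corrections on top of this.

```python
# Hedged sketch: proportional back-allocation of a fiscally metered total to
# wells using test-based rate estimates. The allocation factor is the ratio
# of metered to estimated production; values far from 1.0 flag stale models.

def allocate(fiscal_total, estimates):
    """Scale per-well estimates to match the fiscal meter.

    fiscal_total: metered production at the network outlet.
    estimates:    dict of well name -> model-estimated rate.
    Returns (allocated volumes per well, allocation factor).
    """
    estimated_total = sum(estimates.values())
    factor = fiscal_total / estimated_total
    allocated = {w: q * factor for w, q in estimates.items()}
    return allocated, factor

# Hypothetical daily figures for a three-well group:
estimates = {"W-1": 1200.0, "W-2": 800.0, "W-3": 2000.0}  # stb/d from tests
allocated, factor = allocate(fiscal_total=3800.0, estimates=estimates)
# factor < 1.0 here: the well models jointly over-estimate production.
```

Updating the well models for reservoir-pressure and GOR drift between tests, as the paper proposes, keeps the estimates closer to the metered total and so keeps the allocation factor, and the per-well error it redistributes, small.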