The forthcoming generation of many-core architectures demands a strong paradigm shift in the way algorithms are designed to achieve maximum performance in reservoir simulations. In this work, we propose a novel poly-algorithmic solver approach that combines hybrid multicore-CPU and GPU computations for solving the large sparse linear systems arising in realistic black-oil and compositional flow scenarios. The GPU implementation exploits data parallelism by deploying thousands of threads simultaneously while reducing the memory overhead per floating-point operation in most BLAS kernels and in a suite of preconditioner options such as BILU(k), BILUT and multicoloring SSOR. Multicore CPU computations, in turn, exploit functional parallelism to perform system partitioning and reordering, algebraic multigrid preconditioning, sparsification and model-reduction operations, in order to accelerate and reduce the number of GCR iterations. The efficient orchestration of these operations relies on carefully designed heuristics that depend on the timestep evolution, the degree of nonlinearity and the current system properties. Hence, we also propose several criteria to automatically decide which solver configuration to employ at every time step of the simulation. To illustrate the potential of the proposed solver approach, we perform numerical computations on state-of-the-art multicore CPU and GPU platforms. Computational experiments on a wide range of highly complex reservoir cases reveal that the solver approach yields significant speedups with respect to conventional multicore CPU solver implementations. The solver performance gain is on the order of 3x, which translates into roughly a 2x reduction in overall compositional simulation turnaround time. These results demonstrate the potential that many-core solvers have to offer in improving the performance of near-future reservoir simulations.
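The automatic per-timestep selection of a solver configuration described above can be sketched as a simple decision rule. The thresholds, configuration names and inputs below are illustrative assumptions, not the authors' actual criteria, which the abstract does not specify:

```python
def choose_solver_config(dt_days, newton_iters, cond_estimate):
    """Hypothetical heuristic in the spirit of the poly-algorithmic approach.

    Inputs (all assumed for illustration):
      dt_days       -- current timestep length
      newton_iters  -- Newton iterations taken at the previous step
                       (a proxy for the degree of nonlinearity)
      cond_estimate -- rough condition-number estimate of the system
    """
    if cond_estimate > 1e8:
        # Hard systems: fall back to CPU algebraic multigrid preconditioning.
        return {"device": "CPU", "preconditioner": "AMG", "solver": "GCR"}
    if newton_iters > 4 or dt_days > 30:
        # Strongly nonlinear or large steps: a more robust GPU block ILU.
        return {"device": "GPU", "preconditioner": "BILU(1)", "solver": "GCR"}
    # Easy steps: the cheapest, most parallel GPU option.
    return {"device": "GPU", "preconditioner": "multicolor-SSOR", "solver": "GCR"}
```

The point of such a rule is that no single preconditioner wins across the whole simulation: easy timesteps favor cheap, highly parallel GPU kernels, while ill-conditioned systems justify the extra cost of CPU-side multigrid.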
Abstract. Current many-core GPUs have enormous processing power, and unlocking this power for general-purpose computing is very attractive due to their low cost and efficient power utilization. However, the fine-grained parallelism and the stream-programming model supported by these GPUs require a paradigm shift, especially for algorithm designers. In this paper we present the design of a GPU-based sparse linear solver using the Generalized Minimal RESidual (GMRES) algorithm in the CUDA programming environment. Our implementation achieved a speedup of over 20x on the Tesla T10P-based GTX 280 GPU card for benchmarks ranging from a few thousand to a few million unknowns.
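For reference, the GMRES algorithm itself can be stated compactly. The following is a minimal NumPy sketch of unrestarted GMRES via the Arnoldi process, not the paper's CUDA implementation; in a GPU version the sparse matrix-vector product and the vector kernels (dot, axpy, norm) are the operations offloaded to the device:

```python
import numpy as np

def gmres(A, b, x0=None, tol=1e-8, max_iter=50):
    """Minimal unrestarted GMRES: build a Krylov basis with Arnoldi and
    solve a small least-squares problem for the residual-minimizing update."""
    n = len(b)
    x0 = np.zeros(n) if x0 is None else x0
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta < tol:
        return x0
    Q = np.zeros((n, max_iter + 1))   # orthonormal Krylov basis
    H = np.zeros((max_iter + 1, max_iter))  # upper Hessenberg matrix
    Q[:, 0] = r0 / beta
    for k in range(max_iter):
        # Arnoldi step: one matrix-vector product per iteration
        w = A @ Q[:, k]
        for j in range(k + 1):        # modified Gram-Schmidt orthogonalization
            H[j, k] = Q[:, j] @ w
            w -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = w / H[k + 1, k]
        # Minimize ||beta*e1 - H y|| over the current Krylov subspace
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        if np.linalg.norm(H[:k + 2, :k + 1] @ y - e1) < tol:
            return x0 + Q[:, :k + 1] @ y
    return x0 + Q[:, :k + 1] @ y
```

Production implementations use Givens rotations to update the least-squares solution incrementally and restart after a fixed subspace size to bound memory; this sketch re-solves the small system each step for clarity.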
This presentation outlines an integrated workflow that incorporates 4D seismic data into the Ekofisk field reservoir model history matching process. Successful applications and associated benefits of the workflow are also presented. A seismic monitoring programme has been established at Ekofisk, with 4D seismic surveys acquired over the field in 1989, 1999, 2003, 2006 and 2008. Ekofisk 4D seismic data is becoming a quantitative tool for describing the spatial distribution of reservoir properties and compaction. The seismic monitoring data is used to optimize the Ekofisk waterflood by providing insights into water movement and subsequently improving infill well placement. Reservoir depletion and water injection in Ekofisk lead to reservoir rock compaction and fluid substitution. These changes are revealed in space and time through 4D seismic differences. Inconsistencies between predicted 4D differences (calculated from reservoir model output) and actual 4D differences are therefore used to identify reservoir model shortcomings. This process is captured in the following workflow: (1) prepare and upscale a geologic model, (2) simulate fluid flow and associated rock physics using a reservoir model, (3) generate a synthetic 4D seismic response from fluid and rock-physics forecasts, and (4) update the reservoir model to better match actual production/injection data and/or the 4D seismic response. The above-mentioned Seismic History Matching (SHM) workflow employs rock-physics modeling to quantitatively constrain the reservoir model and develop a simulated 4D seismic response. Parameterization techniques are then used to constrain and update the reservoir model. This workflow updates geological parameters in an optimization loop through minimization of a misfit function. It is an automated closed-loop system, and optimization is performed using an in-house computer-assisted history matching tool based on an evolutionary algorithm.
In summary, the Ekofisk 4D SHM workflow is a multi-disciplinary process that requires collaboration between geological, geomechanical, geophysical and reservoir engineering disciplines to optimize well placement and reservoir management.
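The misfit-minimization loop at the heart of the SHM workflow can be illustrated with a toy evolution strategy. This is a generic sketch, not the in-house tool mentioned above, whose actual algorithm and parameters are not given; the misfit function here stands in for the mismatch between predicted and actual 4D seismic differences and production data:

```python
import numpy as np

def evolutionary_history_match(misfit, x0, sigma=0.5, pop=20, gens=100, seed=0):
    """Toy (1+lambda) evolution strategy minimizing a misfit function.

    misfit -- callable mapping a parameter vector to a scalar mismatch
              (stand-in for the simulated-vs-observed 4D seismic misfit)
    x0     -- initial geological parameter vector
    """
    rng = np.random.default_rng(seed)
    best = np.asarray(x0, dtype=float)
    best_f = misfit(best)
    for _ in range(gens):
        # Mutate the current best model parameters to form a candidate population
        cands = best + sigma * rng.standard_normal((pop, best.size))
        fs = np.array([misfit(c) for c in cands])
        i = fs.argmin()
        if fs[i] < best_f:
            best, best_f = cands[i], fs[i]
        else:
            sigma *= 0.9  # no improvement: shrink the mutation step
    return best, best_f
```

In the real workflow each misfit evaluation requires a full flow simulation plus synthetic 4D seismic forward modeling, which is why the evaluation count, not the optimizer overhead, dominates the cost of closed-loop history matching.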
The concept of uncertainty, risk, and probabilistic assessment is increasingly employed as a standard in the E&P industry to assist in development and investment decisions. The Tor field in the Greater Ekofisk Area of the North Sea is a producing chalk field with a 35-year production history and aging facilities. This naturally fractured chalk reservoir has had limited water injection and experienced rapid decline. An integrated subsurface uncertainty study has been performed to support a potential redevelopment of the Tor field. This paper demonstrates the integrated workflow for the uncertainty study and the methodologies used to overcome challenges in reservoir modeling and forecasting. The results of the sensitivity analysis and assisted history matching (AHM) process are illustrated, as well as how the results were applied in the evaluation of redevelopment options and in preparing the future reservoir management plan. The main challenges in reservoir modeling, forecasting and overall evaluation of the Tor field are: 1) Uncertainties outside the well-control area. These result in significant structural uncertainty and, in turn, even greater uncertainty in structure-dependent properties. 2) Uncertainty and implementation of inter-dependent static properties and their spatial distribution. The deterministic base-case model is only one of thousands of property realizations from the geostatistical modeling process. 3) Uncertainty and systematic implementation of effective permeability. Effective permeability in the chalk reservoir is a combination of enhanced matrix permeability and "highways". Predicting potential "highways" not identified by existing wells is especially challenging. 4) Simulation time. These uncertainties directly influence the determination of hydrocarbons in place, well placement, and waterflooding efficiency, and add risk to the production forecast used to justify field redevelopment.
The workflow was: 1) Identification and framing of uncertainty parameters. 2) Complete static and dynamic parameter analysis and integration. 3) Comprehensive sensitivity analysis and AHM. 4) Forecasting based on multiple calibrated models to obtain rigorous probabilistic production profiles. The approaches used include: 1) Realization of structural uncertainty and associated properties by a robust approach, which is advantageous for the AHM process. 2) Employment of multiple property realizations. 3) Use of a 3D seismic attribute for capturing the uncertainty of potential "highways" and for systematic effective-permeability implementation. 4) Addressing uncertainty in waterflood sweep efficiency. From the integrated workflow and robust methodology, a suite of "good quality" AHM models with equal probability is obtained. AHM has narrowed down the uncertainty range, and from post-AHM analysis the initial resource range and the main parameters influencing development are determined. As a best practice, we recommend using representative models sampled across the ensemble, together with well and operational uncertainties, rather than a specific P10, P50 or P90 model, to make the final probabilistic forecasts.
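The final forecasting step above, collapsing an ensemble of equally probable calibrated models into probabilistic production profiles, can be sketched with a percentile computation. The data layout and function are illustrative assumptions, not the study's actual tool; note also that here "PNN" denotes the NN-th statistical percentile of the ensemble, whereas petroleum-industry convention often labels the high case P10 (10% probability of exceedance):

```python
import numpy as np

def probabilistic_profiles(forecasts, percentiles=(10, 50, 90)):
    """Collapse an ensemble of production forecasts into percentile profiles.

    forecasts -- 2D array, one row per calibrated model,
                 one column per forecast time step (assumed layout)
    Returns a dict mapping 'P10'/'P50'/'P90' to per-timestep profiles.
    """
    f = np.asarray(forecasts, dtype=float)
    # Percentile across the model axis, separately at each time step
    return {f"P{p}": np.percentile(f, p, axis=0) for p in percentiles}
```

Sampling full representative models, rather than reading off a single P10/P50/P90 curve, preserves the physical consistency of each forecast when well and operational uncertainties are layered on afterwards.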