One of the key objectives of well testing is to derive effective reservoir properties, such as permeability, to provide input for reservoir simulation. The traditional approach in well test analysis is first to separate the transient pressure drawdowns and build-ups associated with constant flowing rates, and then to analyze them by fitting a pre-selected model, deriving effective reservoir permeability through an inversion process. However, the results obtained this way are not truly dynamic, or are at best "pseudo-dynamic", because a continuous signal has been broken into discontinuous segments. Some information about the reservoir system response (beyond a changed permeability value) may already be lost, while uncertainties are increased or multiplied during data evaluation and analysis. This paper presents a new approach that allows continuous analysis and interpretation of the reservoir system response, so that the information derived can reflect real-time, dynamic changes in the reservoir. This approach has proved more powerful in analyzing transient pressure, particularly data from permanent downhole gauges (PDGs). Numerical well testing procedures, assisted by a neural network method, were used to analyze data from simulated and field cases in a continuous, systematic fashion. First, the flowing rate history is recovered from the measured transient pressure. Transient pressure histories are then reproduced through simulations with both a well test forward model and a neural network black-box model. A match between the measured pressure response (PDG data) and the reproduced transient pressure histories is then made, to achieve the ultimate goal of well testing: real-time reservoir monitoring, model calibration and reservoir management. This final matching process is named "guided (by the neural network model) history matching".
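The rate-reconstruction step described above can be sketched in heavily simplified form as a linear deconvolution: treat the measured pressure drop as a superposition of rate steps convolved with a unit-rate drawdown response, then invert for the rate history. The paper uses a neural network for this inversion; the regularized least-squares below is only an illustrative stand-in, and the kernel, constants and numbers are assumptions for the sketch, not the authors' model.

```python
import numpy as np

def unit_response(t, k=50.0, mu=1.0, h=10.0, phi=0.2, ct=1e-5, rw=0.1):
    # Simplified log-approximation of a unit-rate drawdown response.
    # Constants are placeholders; consistent units are assumed.
    t = np.maximum(t, 1e-9)
    return (70.6 * mu / (k * h)) * np.log(t * k / (phi * mu * ct * rw**2) + 1.0)

n = 60
t = np.linspace(0.1, 30.0, n)
dt = t[1] - t[0]

# "True" (normally unknown) rate history: a step change mid-way.
q_true = np.where(t < 15.0, 200.0, 120.0)

# Superposition (convolution) matrix G, so that dp = G @ q.
G = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1):
        G[i, j] = unit_response(t[i] - t[j] + dt) - unit_response(t[i] - t[j])

dp = G @ q_true  # synthetic "measured" pressure drop (noise-free here)

# Recover the rate history by regularized least squares -- a stand-in for
# the neural-network inversion described in the paper.
lam = 1e-6
q_est = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ dp)
```

In the noise-free toy case the inversion recovers the rate step almost exactly; with real PDG data the inversion is ill-conditioned, which is one motivation for replacing it with a trained black-box model.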
Introduction
With the advent of reliable permanent downhole gauges available to the oil industry, the importance of continuous reservoir monitoring in mature fields, and the value delivered by up-to-date information, has been recognized as an essential element of modern reservoir management (Athichanagom et al., 1999; Rossi et al., 2000; Chiriti et al., 2001; Ballinas and Owen, 2002; Haddad et al., 2004; Olsen and Nordtvedt, 2005; Weaver et al., 2005; Chorneyko, 2006; Frota and Destro, 2006; Olsen and Nordtvedt, 2006). However, the key to ensuring that this information is delivered correctly is a robust analysis method that can not only handle the large, noisy data sets but also distil the "message" from the dynamic data for sound decision making. Quite a few studies using wavelet algorithms have been published in recent years (Kikani and He, 1998; Soliman et al., 2001; Ouyang and Kikani, 2002; Guan et al., 2004; Olsen and Nordtvedt, 2005; Ribeiro et al., 2006; Zheng and Li, 2007); however, these mainly concern pre-analysis data processing, and the analysis method applied after that processing is still more or less based on the constant-terminal-rate pressure drawdown (DD) solution, i.e. the traditional approach.
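The wavelet pre-processing those studies describe can be illustrated with a minimal one-level Haar wavelet shrinkage of a noisy pressure record. The published methods use multi-level transforms and more careful threshold selection; this sketch, with its invented signal and threshold, is only meant to show the general idea of suppressing gauge noise while preserving the pressure trend.

```python
import numpy as np

def haar_denoise(x, threshold):
    # One-level Haar wavelet transform with soft thresholding of the
    # detail coefficients, then the inverse transform.
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2                    # truncate to an even length
    a = (x[:n:2] + x[1:n:2]) / np.sqrt(2.0)    # approximation coefficients
    d = (x[:n:2] - x[1:n:2]) / np.sqrt(2.0)    # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    y = np.empty(n)
    y[0::2] = (a + d) / np.sqrt(2.0)           # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(0)
clean = np.linspace(3000.0, 2900.0, 256)       # smooth drawdown trend (illustrative units)
noisy = clean + rng.normal(0.0, 2.0, 256)      # synthetic gauge noise
denoised = haar_denoise(noisy, threshold=3.0)
```

With the thresholding line removed the transform reconstructs the input exactly, which is why wavelet shrinkage can remove noise without distorting the slow pressure trend that carries the reservoir response.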